Browsing by Author "Steinicke, Frank"
Item
Motion In-Betweening with Phase Manifolds (ACM Association for Computing Machinery, 2023)
Starke, Paul; Starke, Sebastian; Komura, Taku; Steinicke, Frank (editors: Wang, Huamin; Ye, Yuting; Zordan, Victor)
This paper introduces a novel data-driven motion in-betweening system that reaches target poses of characters by making use of phase variables learned by a Periodic Autoencoder. Our approach utilizes a mixture-of-experts neural network model in which the phases cluster movements in both space and time with different expert weights. Each generated set of weights then produces a sequence of poses in an autoregressive manner between the current and target state of the character. In addition, to satisfy poses that are manually modified by animators, or where certain end effectors serve as constraints to be reached by the animation, a learned bi-directional control scheme is implemented to satisfy such constraints. The results demonstrate that using phases for motion in-betweening tasks sharpens the interpolated movements and stabilizes the learning process. Moreover, using phases also makes it possible to synthesize more challenging movements beyond locomotion behaviors, and style control is enabled between given target keyframes. Our proposed framework can compete with popular state-of-the-art methods for motion in-betweening in terms of motion quality and generalization, especially in the presence of long transition durations. Our framework contributes to faster prototyping workflows for creating animated character sequences, which is of great interest to the game and film industry.

Item
Safe Walking Zones: Visual Guidance for Redirected Walking in Confined Real-World Spaces (The Eurographics Association, 2018)
Lubos, Paul; Bruder, Gerd; Steinicke, Frank (editors: Bruder, Gerd; Yoshimoto, Shunsuke; Cobb, Sue)
Walking is usually considered the most natural form of self-motion in a virtual environment (VE). However, the confined physical workspace of typical virtual reality (VR) labs often prevents natural exploration of larger VEs. Redirected walking (RDW) has been introduced as a potential solution to this restriction, but corresponding techniques often induce strong manipulations when the workspace is very small and therefore fail to provide a natural experience. In this paper we propose a user interface approach that supports natural walking through a potentially infinite virtual scene while the user is confined to a severely restricted physical workspace. This virtual locomotion technique relies on a safety volume, displayed as a semi-transparent half-capsule, inside which the user can walk without manipulations caused by RDW. We designed a circular redirection approach that is applied when the user leaves this safety volume, complemented by a deterrent approach for guiding the user while outside of it. We discuss in detail how user movements inside these regions are transferred to the virtual camera in order to enable walking between points of interest in VEs, and we present the results of a usability study in which we evaluate the approach.
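To illustrate the safety-volume mechanism described in the second item (Safe Walking Zones), the following minimal sketch shows how a safe-zone check and a circular redirection gain could be wired together. The radius, gain value, function names, and 2D geometry are illustrative assumptions, not the implementation from the paper.

```python
# A minimal sketch, not the authors' implementation: the user walks 1:1 inside
# a half-capsule-shaped safe zone, and a circular redirection gain is injected
# into the virtual camera yaw only after the user leaves it.
import numpy as np

SAFE_RADIUS = 1.5       # metres; assumed footprint radius of the half-capsule
ROTATION_GAIN = 0.2     # radians/second of injected yaw outside the safe zone


def inside_safe_zone(user_pos, zone_center):
    """True if the tracked position lies within the safe walking zone footprint."""
    return np.linalg.norm(user_pos[:2] - zone_center[:2]) <= SAFE_RADIUS


def redirect_yaw(user_pos, user_heading, zone_center, virtual_yaw, dt):
    """Update the virtual camera yaw: 1:1 mapping inside the zone,
    circular redirection (constant injected rotation) outside of it."""
    if inside_safe_zone(user_pos, zone_center):
        return virtual_yaw                      # natural walking, no manipulation
    to_center = zone_center[:2] - user_pos[:2]
    # Rotate toward whichever side the zone center lies on (sign of 2D cross product).
    side = np.sign(user_heading[0] * to_center[1] - user_heading[1] * to_center[0])
    return virtual_yaw + side * ROTATION_GAIN * dt


# Example: one 90 Hz update with the user just outside the safe zone.
yaw = redirect_yaw(user_pos=np.array([2.0, 0.0]),
                   user_heading=np.array([1.0, 0.0]),
                   zone_center=np.array([0.0, 0.0]),
                   virtual_yaw=0.0,
                   dt=1.0 / 90.0)
```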
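Similarly, for the first item (Motion In-Betweening with Phase Manifolds), the sketch below illustrates how phase features might gate a mixture of experts whose blended parameters autoregressively step a pose toward a target. The dimensions, single-linear-layer experts, and softmax gating are illustrative assumptions rather than the paper's actual architecture.

```python
# A minimal sketch of the general idea, under assumed shapes and layer sizes:
# phase features gate a mixture of experts, and the blended parameters predict
# the next pose between the current and target state in an autoregressive loop.
import numpy as np

rng = np.random.default_rng(0)

POSE_DIM = 24       # assumed pose feature size
PHASE_DIM = 8       # assumed phase manifold size (e.g. from a Periodic Autoencoder)
NUM_EXPERTS = 4

# Illustrative expert parameters: each expert is a single linear map here.
experts = rng.standard_normal((NUM_EXPERTS, POSE_DIM, 2 * POSE_DIM)) * 0.01
gate_W = rng.standard_normal((NUM_EXPERTS, PHASE_DIM)) * 0.1


def expert_weights(phase):
    """Softmax gating: blend the experts according to the phase features."""
    logits = gate_W @ phase
    w = np.exp(logits - logits.max())
    return w / w.sum()


def step(pose, target, phase):
    """One autoregressive step: the blended expert maps (pose, target) to a pose update."""
    W = np.tensordot(expert_weights(phase), experts, axes=1)   # (POSE_DIM, 2 * POSE_DIM)
    return pose + W @ np.concatenate([pose, target])


# Roll out a short transition from the current pose toward a target keyframe.
pose = np.zeros(POSE_DIM)
target = np.ones(POSE_DIM)
phase = rng.standard_normal(PHASE_DIM)
for _ in range(30):
    pose = step(pose, target, phase)
```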