EG 2016 - Short Papers
Browsing EG 2016 - Short Papers by Subject "Animation"
Item: Example-based Body Model Optimization and Skinning (The Eurographics Association, 2016)
Authors: Fechteler, Philipp; Hilsmann, Anna; Eisert, Peter
Editors: T. Bashford-Rogers and L. P. Santos
Abstract: In this paper, we present an example-based framework for the generation of a realistic kinematic 3D human body model that optimizes shape, pose and skinning parameters. For enhanced realism, the skinning is realized as a combination of Linear Blend Skinning (LBS) and Dual quaternion Linear Blending (DLB), which compensates the deficiencies of using only one of these approaches (e.g. candy-wrapper and bulging artifacts) and supports interpolation of more than two joint transformations. The optimization framework enforces two objectives, resembling both shape and pose as closely as possible, by iteratively minimizing the objective function with respect to (a) the vertices, (b) the skinning weights and (c) the joint parameters. Smoothness is ensured by adding a weighted Laplacian to the usual data term in the objective function, which introduces the only parameter that needs to be specified. Experimental results on publicly available datasets demonstrate the effectiveness of the resulting shape model, which exhibits convincing naturalism. Because examples drive the optimization of all parameters, our framework is easy to use and does not require sophisticated parameter tuning or user intervention.

Item: Robust Transmission of Motion Capture Data using Interleaved LDPC and Inverse Kinematics (The Eurographics Association, 2016)
Authors: Furtado, Antonio Carlos; Cheng, Irene; Dufaux, Frederic; Basu, Anup
Editors: T. Bashford-Rogers and L. P. Santos
Abstract: Recent advances in smart-sensor technology have improved the precision of Motion Capture (MoCap) data for realistic animation. However, this precision also imposes bandwidth challenges. While research in recent years has focused on MoCap compression, little attention has been given to lossy transmission that exploits human perceptual thresholds, which would benefit many online applications, e.g., interactive games, on-demand broadcast, movies and tutoring using dynamic motion sequences. Given the growing number of applications on mobile devices and wireless networks, where bandwidth is limited and connections are unreliable and subject to interference or shadowing, data loss is inevitable. We introduce a new representation for MoCap data, integrating Interleaved Low-Density Parity-Check (I-LDPC) coding with keyframe-based interpolation and inverse kinematics, to better address the problem of MoCap data loss during transmission. We believe this is the first study to address robust transmission of MoCap data under loss. Experimental results assessed using mean opinion scores demonstrate that our approach achieves substantial improvement over alternative transmission methods.
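The skinning idea named in the first abstract, combining Linear Blend Skinning with Dual quaternion Linear Blending so that each compensates the artifacts of the other, can be illustrated with a small numeric sketch. The abstract does not say how the two results are mixed, so the blend factor alpha, the function names and the per-joint (R, t) transform format below are assumptions; this is a minimal NumPy illustration, not the paper's implementation.

import numpy as np

def quat_from_matrix(R):
    # 3x3 rotation matrix -> unit quaternion (w, x, y, z)
    w = np.sqrt(max(0.0, 1.0 + R[0, 0] + R[1, 1] + R[2, 2])) / 2.0
    x = np.copysign(np.sqrt(max(0.0, 1.0 + R[0, 0] - R[1, 1] - R[2, 2])) / 2.0, R[2, 1] - R[1, 2])
    y = np.copysign(np.sqrt(max(0.0, 1.0 - R[0, 0] + R[1, 1] - R[2, 2])) / 2.0, R[0, 2] - R[2, 0])
    z = np.copysign(np.sqrt(max(0.0, 1.0 - R[0, 0] - R[1, 1] + R[2, 2])) / 2.0, R[1, 0] - R[0, 1])
    return np.array([w, x, y, z])

def quat_mul(a, b):
    # Hamilton product of two quaternions
    aw, ax, ay, az = a; bw, bx, by, bz = b
    return np.array([aw*bw - ax*bx - ay*by - az*bz,
                     aw*bx + ax*bw + ay*bz - az*by,
                     aw*by - ax*bz + ay*bw + az*bx,
                     aw*bz + ax*by - ay*bx + az*bw])

def dual_quat(R, t):
    # unit dual quaternion (q_r, q_d) encoding the rigid transform x -> R x + t
    qr = quat_from_matrix(R)
    qd = 0.5 * quat_mul(np.array([0.0, *t]), qr)
    return qr, qd

def lbs_transform(vertex, weights, transforms):
    # Linear Blend Skinning: weighted sum of rigidly transformed positions
    return sum(w * (R @ vertex + t) for w, (R, t) in zip(weights, transforms))

def dlb_transform(vertex, weights, transforms):
    # Dual quaternion Linear Blending of the joint transforms (R_j, t_j)
    qr_b, qd_b = np.zeros(4), np.zeros(4)
    pivot = dual_quat(*transforms[0])[0]
    for w, (R, t) in zip(weights, transforms):
        qr, qd = dual_quat(R, t)
        if np.dot(qr, pivot) < 0.0:        # keep all quaternions in the same hemisphere
            qr, qd = -qr, -qd
        qr_b += w * qr
        qd_b += w * qd
    n = np.linalg.norm(qr_b)
    qr_b, qd_b = qr_b / n, qd_b / n
    # recover rotation and translation from the normalised dual quaternion
    w_, x, y, z = qr_b
    R = np.array([[1 - 2*(y*y + z*z), 2*(x*y - w_*z),     2*(x*z + w_*y)],
                  [2*(x*y + w_*z),    1 - 2*(x*x + z*z),  2*(y*z - w_*x)],
                  [2*(x*z - w_*y),    2*(y*z + w_*x),     1 - 2*(x*x + y*y)]])
    t = 2.0 * quat_mul(qd_b, np.array([qr_b[0], -qr_b[1], -qr_b[2], -qr_b[3]]))[1:]
    return R @ vertex + t

def blended_skinning(vertex, weights, transforms, alpha=0.5):
    # hypothetical per-vertex mix of the LBS and DLB positions; alpha is a made-up blend factor
    return (1.0 - alpha) * lbs_transform(vertex, weights, transforms) \
           + alpha * dlb_transform(vertex, weights, transforms)

Pure LBS collapses under large twists (the candy-wrapper artifact) while pure DLB can bulge near joints, which is the motivation the abstract gives for mixing the two skinned positions.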
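The second abstract names three building blocks: interleaved LDPC channel coding, keyframe-based interpolation and inverse kinematics. The sketch below illustrates only a generic interpolation fallback, filling in frames dropped in transit from the nearest correctly received frames; the array layout, the function name and the choice of per-channel linear interpolation are assumptions, and the LDPC and IK stages are not shown.

import numpy as np

def interpolate_lost_frames(frames, received):
    # Reconstruct dropped MoCap frames by linearly interpolating each joint
    # channel between the nearest frames that arrived intact.
    #   frames   : (T, C) array of joint-channel values; lost rows may hold garbage
    #   received : length-T boolean mask, True where the frame was received
    frames = np.asarray(frames, dtype=float).copy()
    t = np.arange(frames.shape[0])
    good = np.flatnonzero(received)
    for c in range(frames.shape[1]):
        # np.interp clamps at the boundaries, so losses at the start or end
        # simply repeat the nearest received frame
        frames[:, c] = np.interp(t, good, frames[good, c])
    return frames

# Example: frames 2 and 3 of a six-frame clip are lost and filled back in
clip = np.linspace(0.0, 1.0, 6)[:, None] * np.array([10.0, -5.0])
mask = np.array([True, True, False, False, True, True])
print(interpolate_lost_frames(clip, mask))

Linear interpolation of raw joint channels is only a stand-in here; the paper's pipeline additionally corrects errors with I-LDPC and enforces plausible poses with inverse kinematics.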