Physics-based Reconstruction and Animation of Humans
Date
2017-09-22
Authors
Ichim, Alexandru Eugen
Publisher
École Polytechnique Fédérale de Lausanne
Abstract
Creating digital representations of humans is of utmost importance for applications ranging from entertainment (video games, movies) to human-computer interaction and even psychiatric treatments. What makes building credible digital doubles difficult is that the human visual system is highly sensitive to the complex expressivity of, and potential anomalies in, body structure and motion.
This thesis presents several projects that tackle these problems from two different perspectives: lightweight acquisition and physics-based simulation. It starts by describing a complete pipeline that lets users reconstruct fully rigged 3D facial avatars from video captured with a handheld device (e.g., a smartphone). The avatars use a novel two-scale representation composed of blendshapes and dynamic detail maps, constructed through an optimization that integrates feature tracking, optical flow, and shape from shading. Continuing the theme of accessible acquisition, we discuss a framework for simultaneous tracking and modeling of articulated human bodies from RGB-D data, and show how L1 regularization can be used to extract semantic information from the scanned body shapes.
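The role of the L1 term can be illustrated with a toy sparse-recovery sketch; the matrix sizes, penalty weight, and iteration count below are made up for illustration and are not the thesis's actual formulation. The point is only that an L1 penalty (here minimized via iterative soft thresholding) drives most coefficients to exactly zero, leaving a few interpretable ones.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the L1 norm: shrinks values toward zero."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

rng = np.random.default_rng(1)
A = rng.normal(size=(20, 10))            # made-up measurement matrix
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -0.5]             # sparse ground-truth coefficients
b = A @ x_true                           # noiseless observations

# ISTA: gradient step on 0.5*||Ax - b||^2, then soft-threshold (L1 prox).
x = np.zeros(10)
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
lam = 0.1                                # made-up L1 penalty weight
for _ in range(500):
    x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
# Most entries of x end up exactly zero; only the meaningful ones survive.
```

The same mechanism, applied to coefficients of a body-shape model, keeps only a handful of active components, which is what makes them interpretable.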
In the second half of the thesis, we deviate from standard linear reconstruction and animation models and instead exploit physics-based techniques that can incorporate complex phenomena such as dynamics, collision response, and material incompressibility. The first approach we propose assumes that each 3D scan of an actor records the body in a physical steady state, and uses a process called inverse physics to extract a volumetric, physics-ready anatomical model of the actor. Using biologically inspired growth models for the bones, muscles, and fat, our method obtains realistic anatomical reconstructions that can later be animated with external tracking data such as motion capture marker trajectories. We then extend this idea to a novel physics-based approach for facial reconstruction and animation: a model that simulates biomechanical muscle contractions in a volumetric face model to recreate the facial expressions seen in the input scans. Finally, we show how this approach opens new avenues for dynamic artistic control, simulation of corrective facial surgery, and interaction with external forces and objects.
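The "physical steady state" assumption can be made concrete with a deliberately tiny one-dimensional analogue (all numbers are made up, and this is not the thesis's actual solver): if the scan records an equilibrium configuration, the equilibrium equation can be solved backwards for the unobserved rest configuration.

```python
# Toy 1D analogue of inverse physics: the "scan" records a point mass
# hanging in static equilibrium on a linear spring under gravity.
# Forward equilibrium condition:  k * (x_rest - x_obs) = m * g
# Inverse problem: given the observed x_obs, recover the rest position.
m, g, k = 2.0, 9.81, 100.0   # made-up mass, gravity, and stiffness
x_obs = 0.5                  # observed ("scanned") steady-state position
x_rest = x_obs + m * g / k   # invert the equilibrium condition
```

In the thesis setting the same principle applies in 3D with volumetric elasticity: the unknown rest (anatomical) shape is the one whose simulated equilibrium under the estimated forces reproduces the scanned surface.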
Citation
@PhDThesis{7880/THESES,
  author              = {Ichim, Alexandru Eugen},
  title               = {Physics-based {R}econstruction and {A}nimation of {H}umans},
  abstract            = {Creating digital representations of humans is of utmost importance for applications ranging from entertainment (video games, movies) to human-computer interaction and even psychiatric treatments. What makes building credible digital doubles difficult is the fact that the human visual system is very sensitive to perceiving the complex expressivity and potential anomalies in body structures and motion. This thesis will present several projects that tackle these problems from two different perspectives: lightweight acquisition and physics-based simulation. It starts by describing a complete pipeline that allows users to reconstruct fully rigged 3D facial avatars using video data coming from a handheld device (e.g., smartphone). The avatars use a novel two-scale representation composed of blendshapes and dynamic detail maps. They are constructed through an optimization that integrates feature tracking, optical flow, and shape from shading. Continuing along the lines of accessible acquisition systems, we discuss a framework for simultaneous tracking and modeling of articulated human bodies from RGB-D data. We show how semantic information can be extracted from the scanned body shapes. In the second half of the thesis, we will deviate from using standard linear reconstruction and animation models, and rather focus on exploiting physics-based techniques that are able to incorporate complex phenomena such as dynamics, collision response and incompressibility of the materials. The first approach we propose assumes that each 3D scan of an actor records his body in a physical steady state and uses a process called inverse physics to extract a volumetric physics-ready anatomical model of him. By using biologically-inspired growth models for the bones, muscles and fat, our method can obtain realistic anatomical reconstructions that can be later on animated using external tracking data such as the one resulting from tracking motion capture markers. This is then extended to a novel physics-based approach for facial reconstruction and animation. We propose a facial animation model which simulates biomechanical muscle contractions in a volumetric head model in order to create the facial expressions seen in the input scans. We then show how this approach allows for new avenues of dynamic artistic control, simulation of corrective facial surgery, and interaction with external forces and objects.},
  address             = {Lausanne},
  affiliation         = {EPFL},
  details             = {http://infoscience.epfl.ch/record/231042},
  doctoral            = {EDIC},
  documenturl         = {https://infoscience.epfl.ch/record/231042/files/EPFL_TH7880.pdf},
  doi                 = {10.5075/epfl-thesis-7880},
  extra-id            = {11027135},
  institute           = {IINFCOM},
  keywords            = {scanning; registration; face reconstruction; body reconstruction; simulation; facial animation; physics-based animation; body animation; face modeling; body modeling},
  language            = {eng},
  oai-id              = {oai:infoscience.epfl.ch:231042},
  oai-set             = {IC},
  original-unit       = {LGG},
  pagecount           = {175},
  production-date     = 2017,
  public-defence-date = 2017,
  publisher           = {EPFL},
  school              = {IC},
  status              = {PUBLISHED},
  submitter           = {108898; 108898},
  thesis-id           = {7880},
  thesis-note         = {Prof. Martin Jaggi (président) ; Prof. Mark Pauly (directeur de thèse) ; Prof. Pascal Fua, Prof. Ladislav Kavan, Dr Thabo Beeler (rapporteurs)},
  unit                = {LGG},
  urn                 = {urn:nbn:ch:bel-epfl-thesis7880-9},
  year                = 2017
}