Face/Off: Live Facial Puppetry
dc.contributor.author | Weise, Thibaut | en_US |
dc.contributor.author | Li, Hao | en_US |
dc.contributor.author | Van Gool, Luc | en_US |
dc.contributor.author | Pauly, Mark | en_US |
dc.contributor.editor | Eitan Grinspun and Jessica Hodgins | en_US |
dc.date.accessioned | 2016-02-18T11:50:47Z | |
dc.date.available | 2016-02-18T11:50:47Z | |
dc.date.issued | 2009 | en_US |
dc.description.abstract | We present a complete integrated system for live facial puppetry that enables high-resolution real-time facial expression tracking with transfer to another person's face. The system utilizes a real-time structured light scanner that provides dense 3D data and texture. A generic template mesh, fitted to a rigid reconstruction of the actor's face, is tracked offline in a training stage through a set of expression sequences. These sequences are used to build a person-specific linear face model that is subsequently used for online face tracking and expression transfer. Even with just a single rigid pose of the target face, convincing real-time facial animations are achievable. The actor becomes a puppeteer with complete and accurate control over a digital face. | en_US |
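The abstract's core mechanism, a person-specific linear face model, expresses any tracked expression as the neutral mesh plus a weighted combination of expression basis vectors. The sketch below is only an illustration of that general idea; the function and variable names are assumptions, not the paper's actual implementation.

```python
import numpy as np

def reconstruct_face(neutral, basis, weights):
    """Illustrative linear face model (names are hypothetical).

    neutral : (n_vertices, 3) neutral-pose vertex positions
    basis   : (n_modes, n_vertices, 3) expression displacement modes
    weights : (n_modes,) expression coefficients found by tracking
    """
    # Weighted sum of displacement modes added to the neutral mesh.
    return neutral + np.tensordot(weights, basis, axes=1)

# Tiny toy example: 2 vertices, 1 expression mode.
neutral = np.zeros((2, 3))
basis = np.array([[[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]]])
face = reconstruct_face(neutral, basis, np.array([0.5]))
# Each vertex moves half-way along its mode displacement.
```

In this framing, expression transfer amounts to estimating the weights on the actor's model and applying them to the target's basis, which is consistent with the abstract's claim that a single rigid pose of the target suffices.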
dc.description.sectionheaders | Leveraging Motion Capture Data | en_US |
dc.description.seriesinformation | Eurographics/ACM SIGGRAPH Symposium on Computer Animation | en_US |
dc.identifier.doi | 10.1145/1599470.1599472 | en_US |
dc.identifier.isbn | 978-1-60558-610-6 | en_US |
dc.identifier.issn | 1727-5288 | en_US |
dc.identifier.pages | 7-16 | en_US |
dc.identifier.uri | https://doi.org/10.1145/1599470.1599472 | en_US |
dc.publisher | ACM SIGGRAPH / Eurographics Association | en_US |
dc.title | Face/Off: Live Facial Puppetry | en_US |