Temporally Consistent Wide Baseline Facial Performance Capture via Image Warping

Date
2015
Publisher
The Eurographics Association
Abstract
In this paper, we present a method for detailed, temporally consistent facial performance capture that supports any number of arbitrarily placed video cameras. Using a suitable 3D model as reference geometry, our method tracks facial movement and deformation as well as photometric changes due to illumination and shadows. In an analysis-by-synthesis framework, we warp a single reference image per camera to all frames of the sequence, thereby drastically reducing temporal drift, which is a serious problem for many state-of-the-art approaches. Temporal appearance variations are handled by a photometric estimation component that models local intensity changes between the reference image and each individual frame. All parameters of the problem are estimated jointly, so no separate estimation steps that might interfere with one another are required.
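The analysis-by-synthesis idea in the abstract can be illustrated with a small, heavily simplified sketch. The snippet below is not the authors' implementation: it registers a single grayscale reference image to one target frame using a global affine warp and a single photometric gain/offset pair, with all parameters estimated jointly by nonlinear least squares. The helper names (warp_affine, residuals, register) are hypothetical, and the global affine warp and global photometric model stand in for the paper's 3D-model-guided, locally varying warps and local intensity models.

```python
# Minimal analysis-by-synthesis sketch (illustrative only, not the paper's method):
# jointly estimate a global affine warp and a global photometric gain/offset
# that map a reference image onto a target frame.
import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import least_squares


def warp_affine(ref, params, shape):
    """Sample `ref` at affine-warped coordinates of the target image grid."""
    a, b, tx, c, d, ty = params
    ys, xs = np.mgrid[0:shape[0], 0:shape[1]].astype(np.float64)
    src_x = a * xs + b * ys + tx
    src_y = c * xs + d * ys + ty
    # Bilinear interpolation of the reference image at the warped positions.
    return map_coordinates(ref, [src_y, src_x], order=1, mode="nearest")


def residuals(theta, ref, frame):
    """Joint residual over geometric (warp) and photometric (gain, offset) parameters."""
    warp_params, gain, offset = theta[:6], theta[6], theta[7]
    synth = gain * warp_affine(ref, warp_params, frame.shape) + offset
    return (synth - frame).ravel()


def register(ref, frame):
    """Register `ref` to `frame`; both are assumed to be float grayscale arrays."""
    # Initialise with the identity warp and neutral photometric parameters.
    theta0 = np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0])
    result = least_squares(residuals, theta0, args=(ref, frame), method="lm")
    return result.x
```

Because the geometric and photometric parameters enter a single residual, they are refined together in one optimization, mirroring (in miniature) the joint estimation described in the abstract rather than alternating separate estimation steps.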
Description
@inproceedings{10.2312:vmv.20151263,
  booktitle = {Vision, Modeling & Visualization},
  editor    = {David Bommes and Tobias Ritschel and Thomas Schultz},
  title     = {{Temporally Consistent Wide Baseline Facial Performance Capture via Image Warping}},
  author    = {Kettern, Markus and Hilsmann, Anna and Eisert, Peter},
  year      = {2015},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-905674-95-8},
  DOI       = {10.2312/vmv.20151263}
}