Analysis and Synthesis of 3D Shape Families via Deep-learned Generative Models of Surfaces

Date
2015
Journal Title
Computer Graphics Forum
Publisher
The Eurographics Association and John Wiley & Sons Ltd.
Abstract
We present a method for joint analysis and synthesis of geometrically diverse 3D shape families. Our method first learns part-based templates such that an optimal set of fuzzy point and part correspondences is computed between the shapes of an input collection based on a probabilistic deformation model. In contrast to previous template-based approaches, the geometry and deformation parameters of our part-based templates are learned from scratch. Based on the estimated shape correspondence, our method also learns a probabilistic generative model that hierarchically captures statistical relationships of corresponding surface point positions and parts, as well as their existence in the input shapes. A deep learning procedure is used to capture these hierarchical relationships. The resulting generative model is used to produce control point arrangements that drive shape synthesis by combining and deforming parts from the input collection. The generative model also yields compact shape descriptors that are used to perform fine-grained classification. Finally, it can also be coupled with the probabilistic deformation model to further improve shape correspondence. We provide qualitative and quantitative evaluations of our method for shape correspondence, segmentation, fine-grained classification, and synthesis. Our experiments demonstrate superior correspondence and segmentation results compared to previous state-of-the-art approaches.
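
The following is a minimal, self-contained sketch (plain Python/NumPy, not the authors' implementation) of the kind of hierarchical generative model the abstract describes: a top-level latent code governs which parts exist, and each existing part emits a small set of control point positions. All names, dimensions, priors, and parameter values here are illustrative assumptions, not quantities taken from the paper.

# Illustrative sketch only: a toy hierarchical generative model over part
# existences and control point positions. Parameters are random rather than
# learned; in the paper they would be fit to the shape collection.
import numpy as np

rng = np.random.default_rng(0)

N_PARTS = 4          # hypothetical number of part labels in the family
POINTS_PER_PART = 8  # hypothetical control points emitted per part
LATENT_DIM = 5       # hypothetical dimensionality of the top-level code

# Placeholder "learned" parameters.
W_exist = rng.normal(size=(N_PARTS, LATENT_DIM))                      # part-existence weights
W_mean = rng.normal(size=(N_PARTS, POINTS_PER_PART, 3, LATENT_DIM))   # control-point mean weights
log_sigma = np.full((N_PARTS,), -1.0)                                  # per-part point noise scale

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_shape():
    """Sample part existences and control point positions from the toy model."""
    z = rng.normal(size=LATENT_DIM)              # top-level latent code
    exist_prob = sigmoid(W_exist @ z)            # probability each part exists
    exists = rng.random(N_PARTS) < exist_prob    # Bernoulli part existence
    points = {}
    for p in range(N_PARTS):
        if not exists[p]:
            continue
        mean = W_mean[p] @ z                     # (POINTS_PER_PART, 3) point means
        noise = np.exp(log_sigma[p]) * rng.normal(size=mean.shape)
        points[p] = mean + noise                 # noisy control point positions
    return exists, points

if __name__ == "__main__":
    exists, points = sample_shape()
    print("parts present:", np.nonzero(exists)[0].tolist())
    for p, pts in points.items():
        print(f"part {p}: {pts.shape[0]} control points")

In the actual system, such control point arrangements would drive synthesis by selecting and deforming parts from the input collection; the sketch only illustrates the hierarchical sampling structure (latent code, then part existence, then point positions).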
Citation
@article{10.1111:cgf.12694,
  journal   = {Computer Graphics Forum},
  title     = {{Analysis and Synthesis of 3D Shape Families via Deep-learned Generative Models of Surfaces}},
  author    = {Huang, Haibin and Kalogerakis, Evangelos and Marlin, Benjamin},
  year      = {2015},
  publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
  DOI       = {10.1111/cgf.12694}
}