Appearance-Driven Automatic 3D Model Simplification

Abstract
We present a suite of techniques for jointly optimizing triangle meshes and shading models to match the appearance of reference scenes. This capability has a number of uses, including appearance-preserving simplification of extremely complex assets, conversion between rendering systems, and even conversion between geometric scene representations. We follow and extend the classic analysis-by-synthesis family of techniques: enabled by a highly efficient differentiable renderer and modern nonlinear optimization algorithms, our results are driven to minimize the image-space difference to the target scene when rendered in similar viewing and lighting conditions. As the only signals driving the optimization are differences in rendered images, the approach is highly general and versatile: it easily supports many different forward rendering models such as normal mapping, spatially-varying BRDFs, displacement mapping, etc. Supervision through images only is also key to the ability to easily convert between rendering systems and scene representations. We output triangle meshes with textured materials to ensure that the models render efficiently on modern graphics hardware and benefit from, e.g., hardware-accelerated rasterization, ray tracing, and filtered texture lookups. Our system is integrated in a small Python code base, and can be applied at high resolutions and on large models. We describe several use cases, including mesh decimation, level of detail generation, seamless mesh filtering and approximations of aggregate geometry.
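The analysis-by-synthesis loop the abstract describes can be sketched in miniature: render the current scene parameters, compare against the reference image, and follow the gradient of the image-space loss. The sketch below is purely illustrative and is not the paper's code; it stands in a fixed linear map for the differentiable renderer, and all names and values are hypothetical.

```python
import numpy as np

# Toy analysis-by-synthesis loop: a fixed linear "renderer" A maps scene
# parameters to a tiny image, and we recover the parameters by minimizing
# the image-space L2 difference to a reference rendering.
rng = np.random.default_rng(0)
A = rng.standard_normal((16, 8)) * 0.3   # stand-in for a differentiable renderer
theta_true = rng.standard_normal(8)      # reference scene parameters
target = A @ theta_true                  # "photograph" of the reference scene

theta = np.zeros(8)                      # initial guess for the scene parameters
lr = 0.1                                 # gradient-descent step size
for step in range(2000):
    image = A @ theta                    # forward render of the current guess
    residual = image - target           # image-space difference
    grad = 2.0 * A.T @ residual          # analytic gradient of the L2 loss
    theta -= lr * grad                   # gradient-descent update

final_loss = float(np.sum((A @ theta - target) ** 2))
```

In the paper's setting the linear map is replaced by a full differentiable rasterizer, the parameters are mesh vertices and material textures, and the optimizer is a modern nonlinear method rather than plain gradient descent, but the structure of the loop is the same.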
Citation

@inproceedings{10.2312:sr.20211293,
  booktitle = {Eurographics Symposium on Rendering - DL-only Track},
  editor    = {Bousseau, Adrien and McGuire, Morgan},
  title     = {{Appearance-Driven Automatic 3D Model Simplification}},
  author    = {Hasselgren, Jon and Munkberg, Jacob and Lehtinen, Jaakko and Aittala, Miika and Laine, Samuli},
  year      = {2021},
  publisher = {The Eurographics Association},
  ISSN      = {1727-3463},
  ISBN      = {978-3-03868-157-1},
  DOI       = {10.2312/sr.20211293}
}