Art-directing Appearance using an Environment Map Latent Space

Abstract
In look development, environment maps (EMs) are used to verify 3D appearance under varied lighting (e.g., overcast, sunny, and indoor). Because only one fixed material can be assigned, editing appearance individually for every EM is laborious. Artists can art-direct material and lighting in film post-production, but this is impossible in dynamic real-time games and live augmented reality (AR), where environment lighting is unpredictable. We present a new workflow for customizing appearance variation across a wide range of EM lighting in live applications. Appearance edits can be predefined and then automatically adapted to changes in environment lighting. We achieve this by learning a novel 2D latent space of varied EM lighting. The latent space lets artists browse EMs in a semantically meaningful 2D view. For different EMs, artists can paint different material and lighting parameter values directly onto the latent space. New EMs are robustly encoded into the same space, enabling automatic look-up of the desired appearance. This solves a new problem: preserving art-direction in live applications without any artist intervention.
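
To make the runtime look-up concrete, the sketch below illustrates the general idea rather than the paper's implementation: it assumes a hypothetical encoder encode_em_to_latent that maps an environment map to 2D latent coordinates (here a crude stand-in based on image statistics, not the learned encoder), and an artist-painted parameter map defined over the unit square. At runtime the current EM is encoded and the painted map is bilinearly sampled to retrieve the desired appearance parameter. All names and data in the sketch are illustrative assumptions.

import numpy as np

def encode_em_to_latent(em_rgb):
    # Placeholder for the learned EM encoder (hypothetical stand-in).
    # Maps an HDR environment map to a 2D latent coordinate in [0, 1]^2
    # using simple luminance statistics instead of a trained network.
    luminance = em_rgb.mean(axis=2)
    brightness = np.clip(luminance.mean() / (luminance.max() + 1e-6), 0.0, 1.0)
    contrast = np.clip(luminance.std() / (luminance.mean() + 1e-6), 0.0, 1.0)
    return np.array([brightness, contrast])

def sample_painted_map(painted_map, uv):
    # Bilinearly sample an artist-painted parameter map (H x W) at latent uv.
    h, w = painted_map.shape
    x = uv[0] * (w - 1)
    y = uv[1] * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = painted_map[y0, x0] * (1 - fx) + painted_map[y0, x1] * fx
    bot = painted_map[y1, x0] * (1 - fx) + painted_map[y1, x1] * fx
    return top * (1 - fy) + bot * fy

# Offline: the artist paints, e.g., a roughness value over the 2D latent space.
painted_roughness = np.random.rand(64, 64).astype(np.float32)

# Runtime: a new EM (random HDR data here) is encoded into the latent space,
# and the pre-painted appearance parameter is looked up automatically.
new_em = np.random.rand(128, 256, 3).astype(np.float32)
uv = encode_em_to_latent(new_em)
roughness = sample_painted_map(painted_roughness, uv)
print(f"latent uv = {uv}, roughness = {roughness:.3f}")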

Citation
@inproceedings{10.2312:pg.20211386,
  booktitle = {Pacific Graphics Short Papers, Posters, and Work-in-Progress Papers},
  editor    = {Lee, Sung-Hee and Zollmann, Stefanie and Okabe, Makoto and Wünsche, Burkhard},
  title     = {{Art-directing Appearance using an Environment Map Latent Space}},
  author    = {Petikam, Lohit and Chalmers, Andrew and Anjyo, Ken and Rhee, Taehyun},
  year      = {2021},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-162-5},
  DOI       = {10.2312/pg.20211386}
}