Environment Maps Editing using Inverse Rendering and Adversarial Implicit Functions

dc.contributor.author: D'Orazio, Antonio
dc.contributor.author: Sforza, Davide
dc.contributor.author: Pellacini, Fabio
dc.contributor.author: Masi, Iacopo
dc.contributor.editor: Caputo, Ariel
dc.contributor.editor: Garro, Valeria
dc.contributor.editor: Giachetti, Andrea
dc.contributor.editor: Castellani, Umberto
dc.contributor.editor: Dulecha, Tinsae Gebrechristos
dc.date.accessioned: 2024-11-11T12:48:13Z
dc.date.available: 2024-11-11T12:48:13Z
dc.date.issued: 2024
dc.description.abstract: Editing High Dynamic Range (HDR) environment maps through an inverse differentiable rendering architecture is a complex inverse problem due to the sparsity of relevant pixels and the challenge of balancing light sources against the background. The pixels illuminating the objects are a small fraction of the total image, leading to noise and convergence issues when the optimization directly involves pixel values. HDR images, with pixel values beyond the typical Standard Dynamic Range (SDR), pose additional challenges: higher learning rates corrupt the background during optimization, while lower learning rates fail to manipulate the light sources. Our work introduces a novel method for editing HDR environment maps using differentiable rendering, addressing both the sparsity of relevant pixels and the high variance between pixel values. Instead of introducing strong priors that extract the relevant HDR pixels and separate the light sources, or resorting to tricks such as optimizing the HDR image in log space, we propose to model the optimized environment map with a new variant of implicit neural representations able to handle HDR images. The neural representation is trained with adversarial perturbations over the weights to ensure smooth changes in the output when it receives gradients from the inverse rendering. In this way, we obtain novel and inexpensive environment maps without relying on the latent spaces of expensive generative models, while maintaining the original visual consistency. Experimental results demonstrate the method's effectiveness in reconstructing the desired lighting effects while preserving the fidelity of the map and of reflections on objects in the scene. Our approach paves the way for further tasks, such as estimating a new environment map given a rendering with novel light sources while maintaining the initial perceptual features, and brush stroke-based editing of existing environment maps. Our code is publicly available at github.com/OmnAI-Lab/R-SIREN.
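The abstract names two key ingredients: an implicit neural representation of the environment map (the repository name suggests a SIREN variant) and adversarial perturbations applied to its weights during training. The following minimal PyTorch sketch is an assumption-based illustration of those two ideas, not the authors' R-SIREN implementation; the architecture, the exp output head for HDR radiance, the sharpness-aware-style weight perturbation, and every name and hyperparameter below are hypothetical choices for exposition.

import math
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    # Linear layer followed by sin(omega0 * x), with SIREN-style initialization.
    def __init__(self, in_features, out_features, omega0=30.0, is_first=False):
        super().__init__()
        self.omega0 = omega0
        self.linear = nn.Linear(in_features, out_features)
        with torch.no_grad():
            bound = (1.0 / in_features) if is_first else math.sqrt(6.0 / in_features) / omega0
            self.linear.weight.uniform_(-bound, bound)

    def forward(self, x):
        return torch.sin(self.omega0 * self.linear(x))

class ImplicitEnvMap(nn.Module):
    # Maps 2D pixel coordinates in [-1, 1]^2 to HDR RGB radiance.
    def __init__(self, hidden=256, depth=3):
        super().__init__()
        layers = [SineLayer(2, hidden, is_first=True)]
        layers += [SineLayer(hidden, hidden) for _ in range(depth)]
        layers += [nn.Linear(hidden, 3)]
        self.net = nn.Sequential(*layers)

    def forward(self, coords):
        # exp keeps radiance positive and lets it exceed the SDR range.
        return torch.exp(self.net(coords))

def perturbed_step(model, coords, target, loss_fn, eps=1e-3):
    # Gradient at the current weights, used only to build the perturbation.
    loss = loss_fn(model(coords), target)
    grads = torch.autograd.grad(loss, list(model.parameters()))
    # Nudge each weight tensor along its normalized ascent direction.
    deltas = []
    with torch.no_grad():
        for p, g in zip(model.parameters(), grads):
            d = eps * g / (g.norm() + 1e-12)
            p.add_(d)
            deltas.append(d)
    # Backpropagate the loss at the perturbed weights, then undo the nudge;
    # p.grad afterwards holds the gradient taken at the perturbed point.
    adv_loss = loss_fn(model(coords), target)
    adv_loss.backward()
    with torch.no_grad():
        for p, d in zip(model.parameters(), deltas):
            p.sub_(d)
    return adv_loss.item()

# Usage sketch: fit the implicit map to stand-in HDR samples.
model = ImplicitEnvMap()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
coords = torch.rand(4096, 2) * 2 - 1   # random coordinates in [-1, 1]^2
target = torch.rand(4096, 3) * 10.0    # stand-in HDR radiance values
opt.zero_grad()
perturbed_step(model, coords, target, nn.functional.mse_loss)
opt.step()

In a full pipeline, coords and target would be driven by the gradients flowing back from the differentiable renderer rather than a fixed sample set; eps controls how far the weights are pushed in the adversarial direction before the training gradient is taken.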
dc.description.sectionheaders: Rendering
dc.description.seriesinformation: Smart Tools and Applications in Graphics - Eurographics Italian Chapter Conference
dc.identifier.doi: 10.2312/stag.20241339
dc.identifier.isbn: 978-3-03868-265-3
dc.identifier.issn: 2617-4855
dc.identifier.pages: 11 pages
dc.identifier.uri: https://doi.org/10.2312/stag.20241339
dc.identifier.uri: https://diglib.eg.org/handle/10.2312/stag20241339
dc.publisher: The Eurographics Association
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: CCS Concepts: Computing methodologies → Artificial intelligence; Computer graphics; Image manipulation
dc.title: Environment Maps Editing using Inverse Rendering and Adversarial Implicit Functions
Files
Original bundle
Name: stag20241339.pdf
Size: 9.49 MB
Format: Adobe Portable Document Format