Neural Screen Space Rendering of Direct Illumination

Date: 2021
Publisher: The Eurographics Association
Abstract
Neural rendering is a class of methods that use deep learning to produce novel images of scenes from more limited information than traditional rendering methods require. This is useful for information-scarce applications such as mixed reality or semantic photo synthesis, but comes at the cost of control over the final appearance. We introduce the Neural Direct-illumination Renderer (NDR), a neural screen space renderer capable of rendering direct-illumination images of any geometry with opaque materials under distant illumination. The NDR uses screen space buffers describing material, geometry, and illumination as inputs to provide direct control over the output. We introduce the use of intrinsic image decomposition to allow a Convolutional Neural Network (CNN) to learn a mapping from a large number of pixel buffers to rendered images. The NDR predicts shading maps, which are subsequently combined with albedo maps to create a rendered image. We show that the NDR produces plausible images that can be edited by modifying the input maps, and that it marginally outperforms the state of the art while also providing more functionality.
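The composition described in the abstract can be illustrated with a minimal sketch: a small CNN maps stacked screen-space buffers to a shading map, which is multiplied pixel-wise with the albedo buffer to form the final image. The network shape, channel counts, and buffer layout below are illustrative assumptions, not the architecture used in the paper.

import torch
import torch.nn as nn

class ShadingNet(nn.Module):
    """Toy CNN that predicts an RGB shading map from screen-space buffers."""
    def __init__(self, in_channels=9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, buffers):
        # Clamp to non-negative values, since shading is a non-negative quantity.
        return torch.relu(self.net(buffers))

# Hypothetical buffer layout (assumed, not from the paper): normals (3) +
# depth (1) + roughness (1) + a 4-channel encoding of the distant illumination.
buffers = torch.randn(1, 9, 256, 256)   # stacked screen-space input buffers
albedo = torch.rand(1, 3, 256, 256)     # albedo buffer from the G-buffer pass

shading = ShadingNet()(buffers)
image = shading * albedo                # intrinsic-style composition: shading x albedo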
Citation
@inproceedings{10.2312:pg.20211385,
  booktitle = {Pacific Graphics Short Papers, Posters, and Work-in-Progress Papers},
  editor    = {Lee, Sung-Hee and Zollmann, Stefanie and Okabe, Makoto and Wünsche, Burkhard},
  title     = {{Neural Screen Space Rendering of Direct Illumination}},
  author    = {Suppan, Christian and Chalmers, Andrew and Zhao, Junhong and Doronin, Alex and Rhee, Taehyun},
  year      = {2021},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-162-5},
  DOI       = {10.2312/pg.20211385}
}