Acquisition, Encoding and Rendering of Material Appearance Using Compact Neural Bidirectional Texture Functions

dc.contributor.author: Rainer, Gilles
dc.date.accessioned: 2022-01-18T15:31:24Z
dc.date.available: 2022-01-18T15:31:24Z
dc.date.issued: 2021-11-23
dc.description.abstract: This thesis addresses the problem of photo-realistic rendering of real-world materials. Currently, the most faithful approach to rendering an existing material is to scan its Bidirectional Texture Function (BTF), which relies on exhaustive acquisition of reflectance data from the material sample. This incurs heavy costs in both capture time and memory, so its main drawback is a lack of practicality. The scope of this thesis is two-fold: the implementation of a full BTF pipeline (data acquisition, processing and rendering) and the design of a compact neural material representation. We first present our custom BTF scanner, which uses a freely positionable camera and light source to acquire light- and view-dependent textures. During the processing phase, the textures are extracted from the images and rectified onto a unique grid using an estimated proxy surface. At rendering time, the rectification is reverted, and the estimated height field additionally allows the preservation of material silhouettes. The main part of the thesis is the development of a neural BTF model that is both compact in memory and practical for rendering. Concretely, the material is modeled by a small fully-connected neural network, parametrized on light and view directions as well as a vector of latent parameters that describes the appearance of the surface point. We show that a single network can efficiently learn to reproduce the appearance of a given material. The second focus of our work is an efficient method to translate BTFs into our representation: rather than training a new network instance for each new material, the latent space and network are shared, and an encoder network quickly predicts latent parameter maps for new, unseen materials. All contributions are geared towards making photo-realistic rendering with BTFs more common and practical in computer graphics applications such as games and virtual environments. [en_US]
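The decoder described in the abstract can be pictured as a small fully-connected network that maps a per-texel latent appearance vector plus light and view directions to RGB reflectance. The sketch below assumes a PyTorch implementation; the class name, layer widths and latent dimension are illustrative placeholders, not the architecture used in the thesis.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class NeuralBTFDecoder(nn.Module):
        """Maps (latent code, light direction, view direction) to RGB reflectance."""
        def __init__(self, latent_dim=8, hidden_dim=32):
            super().__init__()
            # Input: per-texel latent vector plus 3D light and view directions.
            self.net = nn.Sequential(
                nn.Linear(latent_dim + 6, hidden_dim), nn.ReLU(),
                nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
                nn.Linear(hidden_dim, 3),  # RGB output
            )

        def forward(self, latent, wi, wo):
            # latent: (N, latent_dim); wi, wo: (N, 3) unit light/view directions.
            return self.net(torch.cat([latent, wi, wo], dim=-1))

    # Query 1024 texels under random light/view directions (placeholder data).
    decoder = NeuralBTFDecoder()
    latent = torch.randn(1024, 8)
    wi = F.normalize(torch.randn(1024, 3), dim=-1)
    wo = F.normalize(torch.randn(1024, 3), dim=-1)
    rgb = decoder(latent, wi, wo)  # shape (1024, 3)

In the shared-latent-space setting described above, the per-texel latent codes would not be optimized from scratch for each material but predicted by a separate encoder network from sparse observations; that encoder is omitted here.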
dc.description.sponsorship: Change of Paradigm Ltd.; EPSRC grant EP/K023578/1 [en_US]
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/2633150
dc.language.iso: en [en_US]
dc.subject: Computer Graphics [en_US]
dc.subject: Machine Learning [en_US]
dc.subject: Deep Learning [en_US]
dc.subject: Neural Networks [en_US]
dc.subject: Bidirectional Texture Functions [en_US]
dc.subject: Material Appearance [en_US]
dc.subject: Rendering [en_US]
dc.subject: Photorealism [en_US]
dc.subject: Appearance Acquisition [en_US]
dc.subject: Appearance Modelling [en_US]
dc.title: Acquisition, Encoding and Rendering of Material Appearance Using Compact Neural Bidirectional Texture Functions [en_US]
dc.type: Thesis [en_US]
Files
Original bundle
Name: thesis-revisions-03-14-final.pdf
Size: 142.12 MB
Format: Adobe Portable Document Format
Description: Thesis Manuscript
License bundle
Name: license.txt
Size: 1.79 KB
Format: Item-specific license agreed upon at submission