PlenopticPoints: Rasterizing Neural Feature Points for High-Quality Novel View Synthesis

Date
2023
Publisher
The Eurographics Association
Abstract
This paper presents a point-based, neural rendering approach for complex real-world objects from a set of photographs. Our method is specifically geared towards representing fine detail and reflective surface characteristics at improved quality over current state-of-the-art methods. From the photographs, we create a 3D point model based on optimized neural feature points located on a regular grid. For rendering, we employ view-dependent spherical harmonics shading, differentiable rasterization, and a deep neural rendering network. By combining a point-based approach and novel regularizers, our method is able to accurately represent local detail such as fine geometry and high-frequency texture while at the same time convincingly interpolating unseen viewpoints during inference. Our method achieves about 7 frames per second at 800×800 pixel output resolution on commodity hardware, putting it within reach for real-time rendering applications.
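As an illustration of the view-dependent spherical harmonics shading mentioned in the abstract, the sketch below evaluates a real SH basis in the per-point viewing direction and contracts it with learned per-point coefficients to obtain an RGB color. This is not the authors' implementation; the SH degree (2), the 0.5 offset, and the clamping are assumptions made purely for illustration.

import numpy as np

def sh_basis_deg2(dirs: np.ndarray) -> np.ndarray:
    # Evaluate the real spherical harmonics basis up to degree 2.
    # dirs: (N, 3) unit view directions; returns (N, 9) basis values.
    x, y, z = dirs[:, 0], dirs[:, 1], dirs[:, 2]
    return np.stack([
        0.282095 * np.ones_like(x),      # l=0
        0.488603 * y,                    # l=1, m=-1
        0.488603 * z,                    # l=1, m=0
        0.488603 * x,                    # l=1, m=1
        1.092548 * x * y,                # l=2, m=-2
        1.092548 * y * z,                # l=2, m=-1
        0.315392 * (3.0 * z * z - 1.0),  # l=2, m=0
        1.092548 * x * z,                # l=2, m=1
        0.546274 * (x * x - y * y),      # l=2, m=2
    ], axis=-1)

def shade_points(sh_coeffs: np.ndarray, view_dirs: np.ndarray) -> np.ndarray:
    # Compute a view-dependent RGB color per point.
    # sh_coeffs: (N, 9, 3) SH coefficients per point and color channel.
    # view_dirs: (N, 3) directions from each point towards the camera.
    d = view_dirs / np.linalg.norm(view_dirs, axis=-1, keepdims=True)
    basis = sh_basis_deg2(d)                          # (N, 9)
    rgb = np.einsum('nk,nkc->nc', basis, sh_coeffs)   # per-channel SH sum
    return np.clip(rgb + 0.5, 0.0, 1.0)               # assumed offset and clamp

In a rasterization-based pipeline such as the one described here, colors of this kind would be computed per visible point before the rasterized feature image is passed to the neural rendering network.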
CCS Concepts: Computing methodologies → Image-based rendering; Point-based models

        
@inproceedings{10.2312:vmv.20231226,
  booktitle = {Vision, Modeling, and Visualization},
  editor    = {Guthe, Michael and Grosch, Thorsten},
  title     = {{PlenopticPoints: Rasterizing Neural Feature Points for High-Quality Novel View Synthesis}},
  author    = {Hahlbohm, Florian and Kappel, Moritz and Tauscher, Jan-Philipp and Eisemann, Martin and Magnor, Marcus},
  year      = {2023},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-232-5},
  DOI       = {10.2312/vmv.20231226}
}