Practical Temporal and Stereoscopic Filtering for Real-time Ray Tracing

Date: 2023
Publisher: The Eurographics Association
Abstract
We present a practical method for temporal and stereoscopic filtering that generates stereo-consistent rendering. Existing methods for stereoscopic rendering often reuse samples from one eye for the other or average between the two eyes. These approaches fail in the presence of ray tracing effects such as specular reflections and refractions. We derive a new blending strategy that leverages variance to compute per-pixel blending weights for both temporal and stereoscopic rendering. In the temporal domain, our method works well in a low-noise context and is robust in the presence of inconsistent motion vectors, where existing methods such as temporal anti-aliasing (TAA) and deep learning super sampling (DLSS) produce artifacts. In the stereoscopic domain, our method provides a new way to ensure consistency between the left and right eyes. The stereoscopic version of our method can be used with our new temporal method or with existing methods such as DLSS and TAA. In all combinations, it reduces the error and significantly increases the consistency between the eyes, making it practical for real-time settings such as virtual reality (VR).
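The abstract describes computing per-pixel blending weights from variance. As an illustrative sketch only (the paper's exact weighting is not given here), one common way to realize such a scheme is inverse-variance weighting, where the noisier of two estimates (e.g. the left- and right-eye colors for a pixel) receives the smaller weight. The function name `blend_by_variance` and the `eps` stabilizer are hypothetical:

```python
def blend_by_variance(c_left, var_left, c_right, var_right, eps=1e-8):
    """Blend two per-pixel estimates by inverse-variance weighting.

    The lower-variance (less noisy) estimate receives the larger weight,
    so the blend leans on the reliable eye when the other eye's sample is
    inconsistent (e.g. a view-dependent specular reflection).
    ``eps`` is an illustrative stabilizer to avoid division by zero.
    """
    w_left = (var_right + eps) / (var_left + var_right + 2.0 * eps)
    return w_left * c_left + (1.0 - w_left) * c_right
```

With equal variances this reduces to a plain average; as one estimate's variance grows, its contribution smoothly falls toward zero.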
CCS Concepts: Computing methodologies -> Rendering; Ray tracing; Antialiasing; Virtual reality

@inproceedings{10.2312:sr.20231129,
  booktitle = {Eurographics Symposium on Rendering},
  editor    = {Ritschel, Tobias and Weidlich, Andrea},
  title     = {{Practical Temporal and Stereoscopic Filtering for Real-time Ray Tracing}},
  author    = {Philippi, Henrik and Frisvad, Jeppe Revall and Jensen, Henrik Wann},
  year      = {2023},
  publisher = {The Eurographics Association},
  ISSN      = {1727-3463},
  ISBN      = {978-3-03868-229-5 978-3-03868-228-8},
  DOI       = {10.2312/sr.20231129}
}