Browsing by Author "Zirr, Tobias"
Now showing 1 - 4 of 4
Item Distortion-Free Displacement Mapping (The Eurographics Association and John Wiley & Sons Ltd., 2019)
Zirr, Tobias; Ritschel, Tobias; Steinberger, Markus and Foley, Tim
Displacement mapping is routinely used to add geometric detail in a fast and easy-to-control way, both in offline rendering and, more recently, in interactive applications such as games. However, it has gone largely unnoticed (with the exception of McGuire and Whitson [MW08]) that applying displacement mapping to a surface with a low-distortion parametrization distorts that parametrization, because the displacement changes the geometry. Typical resulting artifacts are "rubber band"-like distortion patterns in areas of strong displacement change, where a small isotropic area in texture space is mapped to a large anisotropic area in world space. We describe a fast, fully GPU-based two-step procedure to resolve this problem. First, a correction deformation is computed from the displacement map. Second, we propose two variants to apply this correction when computing displacement mapping. The first variant is backward-compatible and can resolve the artifact in any rendering pipeline without modifying it and without requiring additional computation at render time, but it only works for bijective parametrizations. The second variant works for more general parametrizations, but requires modifying the rendering code and incurs a very small computational overhead.

Item Perceptually Guided Automatic Parameter Optimization for Interactive Visualization (The Eurographics Association, 2023)
Opitz, Daniel; Zirr, Tobias; Dachsbacher, Carsten; Tessari, Lorenzo; Guthe, Michael; Grosch, Thorsten
We propose a new reference-free method for automatically optimizing the parameters of visualization techniques such that the perception of visual structures is improved. Manual tuning may require not only domain knowledge of the analyzed data but also deep knowledge of the visualization techniques, and it often becomes challenging as the number of parameters that impact the result grows. To avoid this laborious and difficult task, we first derive an image metric that models the loss of perceived information when a human observer processes a displayed image; good visualization parameters minimize this metric. Our model is loosely based on quantitative studies in the fields of perception and biology covering visual masking, photoreceptor sensitivity, and local adaptation. We then pair our metric with a generic parameter-tuning algorithm to arrive at an automatic optimization method that is oblivious to the concrete relationship between parameter sets and visualization. We demonstrate our method for several volume visualization techniques, where visual clutter, visibility of features, and illumination are often hard to balance. Since the metric can be efficiently computed using image transformations, it can be applied to many visualization techniques and problem settings in a unified manner, including continuous optimization during interactive visual exploration. We also evaluate the effectiveness of our approach in a user study that validates the improved perception of visual features in results optimized using our model of perception.
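As a rough illustration of how a reference-free image metric can drive a generic, visualization-agnostic tuner, consider the minimal sketch below. All names are hypothetical: render stands for any mapping from a parameter vector to a grayscale image, and the toy perceptual_loss merely penalizes local contrast too faint to perceive, a crude stand-in for the paper's model of visual masking, photoreceptor sensitivity, and local adaptation.

```python
import numpy as np

def perceptual_loss(image):
    # Toy stand-in for the reference-free metric: penalize regions whose
    # local contrast falls below a (hypothetical) visibility threshold.
    # The actual metric models masking, receptor sensitivity, adaptation.
    lap = np.abs(4.0 * image[1:-1, 1:-1]
                 - image[:-2, 1:-1] - image[2:, 1:-1]
                 - image[1:-1, :-2] - image[1:-1, 2:])
    threshold = 0.02  # assumed contrast visibility threshold
    return float(np.mean(np.maximum(threshold - lap, 0.0)))

def optimize_parameters(render, p0, sigma=0.1, iters=100, seed=0):
    # Generic derivative-free tuner (a simple (1+1) evolution strategy):
    # it never inspects how parameters map to the rendered image.
    rng = np.random.default_rng(seed)
    p = np.asarray(p0, dtype=float)
    best = perceptual_loss(render(p))
    for _ in range(iters):
        candidate = p + sigma * rng.standard_normal(p.shape)
        loss = perceptual_loss(render(candidate))
        if loss < best:  # keep the better parameter set
            p, best = candidate, loss
    return p, best
```

Because the tuner only ever evaluates the metric on rendered images, the same loop applies to any visualization technique, e.g. optimize_parameters(lambda p: volume_render(p), p0) for a hypothetical volume renderer.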
Item Planetary Shadow-Aware Distance Sampling (The Eurographics Association, 2022)
Breyer, Carl; Zirr, Tobias; Ghosh, Abhijeet; Wei, Li-Yi
Dusk and dawn scenes have been difficult for brute-force path tracers to handle. We identify a major source of inefficiency when explicitly path tracing the atmosphere in such conditions: samples are wasted on the denser, lower parts of the atmosphere, which are shadowed by the planet before the upper, thinner parts as the star sets below the horizon. We present a technique that overcomes this issue by sampling the star only from the unshadowed segments along rays, based on boundaries found by intersecting a cylinder fit to the planet's shadow. We also sample the transmittance by mapping the distances of the boundaries to opacities and sampling the visible segments uniformly in opacity space. Our technique achieves quality similar to brute-force path tracing in around one sixtieth of the time in such conditions.
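The two sampling steps can be sketched as follows, under simplifying assumptions of my own: the shadow is modeled as an infinite cylinder (the actual technique additionally restricts it to the night side of the planet), and a homogeneous extinction coefficient stands in for the true atmospheric density. All function names are hypothetical.

```python
import numpy as np

def shadow_interval(o, v, c, sun_dir, radius):
    # Intersect the ray o + t*v with the infinite cylinder fit to the
    # planet's shadow: axis through the planet center c along sun_dir,
    # radius = planet radius. Returns the t-interval inside the cylinder,
    # or None. (Clipping to the night-side half is omitted here.)
    d = sun_dir / np.linalg.norm(sun_dir)
    w = o - c
    v_p = v - np.dot(v, d) * d  # ray direction, perpendicular to axis
    w_p = w - np.dot(w, d) * d  # origin offset, perpendicular to axis
    a = np.dot(v_p, v_p)
    if a < 1e-12:               # ray parallel to the shadow axis
        return None
    b = 2.0 * np.dot(w_p, v_p)
    q = np.dot(w_p, w_p) - radius * radius
    disc = b * b - 4.0 * a * q
    if disc <= 0.0:
        return None
    s = np.sqrt(disc)
    return ((-b - s) / (2.0 * a), (-b + s) / (2.0 * a))

def sample_lit_segment(t0, t1, sigma, u):
    # Distance sampling restricted to an unshadowed segment [t0, t1],
    # uniform in opacity space; sigma is the assumed homogeneous
    # extinction. Returns the sampled distance and its pdf.
    seg_opacity = 1.0 - np.exp(-sigma * (t1 - t0))
    t = t0 - np.log(1.0 - u * seg_opacity) / sigma
    pdf = sigma * np.exp(-sigma * (t - t0)) / seg_opacity
    return t, pdf
```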
Item Re‐Weighting Firefly Samples for Improved Finite‐Sample Monte Carlo Estimates (© 2018 The Eurographics Association and John Wiley & Sons Ltd., 2018)
Zirr, Tobias; Hanika, Johannes; Dachsbacher, Carsten; Chen, Min and Benes, Bedrich
Samples with high contribution but low probability density, often called fireflies, occur in all practical Monte Carlo estimators and are part of computing unbiased estimates. For finite-sample estimates, however, they can lead to excessive variance. Rejecting all samples classified as outliers, as suggested in previous work, leads to estimates that are too low and can cause undesirable artefacts. In this paper, we show how samples can be re-weighted depending on their contribution and sampling frequency such that the finite-sample estimate gets closer to the correct expected value and the variance can be controlled. To this end, we first derive a theory for how samples should ideally be re-weighted, which shows that this would require the probability density function of the optimal sampling strategy. As this probability density function is generally unknown, we show how the discrepancy between the optimal and the actual sampling strategy can be estimated and used for re-weighting in practice. We describe an efficient algorithm that allows for the necessary analysis of per-pixel sample distributions in the context of Monte Carlo rendering without storing any individual samples, with only minimal changes to the rendering algorithm. It causes negligible runtime overhead, works in constant memory, and is well suited for parallel and progressive rendering. The re-weighting runs as a fast post-process, can be controlled interactively, and our approach is non-destructive in that the unbiased result can be reconstructed at any time.
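A much-simplified sketch of this accumulation-and-reweighting structure follows; it is not the paper's estimator. Samples are binned into geometric luminance cascades using only per-bin sums and counts (hence constant memory, no stored samples), and a shrinkage weight n/(n+1) stands in for the paper's estimate of the discrepancy between the actual and the optimal sampling density. The cascade base and all names are assumptions.

```python
import numpy as np

def accumulate(sums, counts, contribution, base=8.0):
    # Bin one sample's contribution into geometric luminance cascades.
    # Only per-bin running sums and counts are stored: constant memory,
    # no individual samples, matching the constraints described above.
    j = int(np.clip(np.log(max(contribution, 1e-12)) / np.log(base),
                    0, len(sums) - 1))
    sums[j] += contribution
    counts[j] += 1

def estimate(sums, counts, n_samples, unbiased=False):
    # Re-weight as a post-process: bins populated by only a few huge
    # samples (fireflies) are shrunk toward zero. The weight n/(n+1) is
    # a crude stand-in for a density-discrepancy estimate; unbiased=True
    # recovers the plain Monte Carlo mean at any time (non-destructive).
    total = 0.0
    for s, n in zip(sums, counts):
        if n == 0:
            continue
        w = 1.0 if unbiased else n / (n + 1.0)
        total += w * s
    return total / n_samples
```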