EGSR15: 26th Eurographics Symposium on Rendering
Browsing by Subject "I.3.3 [Computer Graphics]" (4 items)
Item: Extracting Microfacet-based BRDF Parameters from Arbitrary Materials with Power Iterations (The Eurographics Association and John Wiley & Sons Ltd., 2015)
Authors: Dupuy, Jonathan; Heitz, Eric; Iehl, Jean-Claude; Poulin, Pierre; Ostromoukhov, Victor
Editors: Jaakko Lehtinen and Derek Nowrouzezahrai
Abstract: We introduce a novel fitting procedure that takes as input an arbitrary material, possibly anisotropic, and automatically converts it to a microfacet BRDF. Our algorithm is based on the property that the distribution of microfacets may be retrieved by solving an eigenvector problem built solely from backscattering samples. We show that the eigenvector associated with the largest eigenvalue is always the only solution to this problem, and compute it using the power iteration method. This approach is straightforward to implement, much faster to compute, and considerably more robust than solutions based on nonlinear optimization. In addition, we provide simple procedures to convert our fits into both Beckmann and GGX roughness parameters, and discuss the advantages of microfacet slope space for making our fits editable. We apply our method to measured materials from two large databases that include anisotropic materials, and demonstrate the benefits of spatially varying roughness on texture-mapped geometric models.
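The dominant-eigenvector computation named in this abstract is the standard power iteration. The sketch below illustrates that step in isolation, assuming a square matrix K has already been assembled from backscattering samples; the build_backscattering_matrix helper is a hypothetical placeholder, since the abstract does not describe the exact construction.

    import numpy as np

    def power_iteration(K, num_iters=100, tol=1e-9):
        """Return the dominant eigenvector of a square matrix K via power iteration."""
        v = np.random.default_rng(0).random(K.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(num_iters):
            w = K @ v
            norm = np.linalg.norm(w)
            if norm == 0.0:
                break
            w /= norm
            if np.linalg.norm(w - v) < tol:
                return w
            v = w
        return v

    # Hypothetical usage: K would be built from backscattering BRDF samples
    # (the paper's exact construction is not given in the abstract).
    # K = build_backscattering_matrix(brdf_samples)   # placeholder
    # microfacet_ndf = power_iteration(K)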
Item: Modeling Luminance Perception at Absolute Threshold (The Eurographics Association and John Wiley & Sons Ltd., 2015)
Authors: Kellnhofer, Petr; Ritschel, Tobias; Myszkowski, Karol; Eisemann, Elmar; Seidel, Hans-Peter
Editors: Jaakko Lehtinen and Derek Nowrouzezahrai
Abstract: When human luminance perception operates close to its absolute threshold, i.e., the lowest perceivable absolute values, appearance changes substantially compared to common photopic or scotopic vision. In particular, most observers report perceiving temporally varying noise. Two causes are physiologically plausible: quantum noise (due to the low absolute number of photons) and spontaneous photochemical reactions. Previously, static noise with a normal distribution and no account for absolute values was combined with a blue hue shift and blur to simulate scotopic appearance on a photopic display for movies and interactive applications (e.g., games). We present a computational model that reproduces the specific distribution and dynamics of "scotopic noise" for specific absolute values. It automatically introduces a perceptually calibrated amount of noise for a given luminance level and supports animated imagery. Our simulation runs in milliseconds at HD resolution on graphics hardware and compares favorably with simpler alternatives in a perceptual experiment.

Item: Motion Aware Exposure Bracketing for HDR Video (The Eurographics Association and John Wiley & Sons Ltd., 2015)
Authors: Gryaditskaya, Yulia; Pouli, Tania; Reinhard, Erik; Myszkowski, Karol; Seidel, Hans-Peter
Editors: Jaakko Lehtinen and Derek Nowrouzezahrai
Abstract: Mobile phones and tablets are rapidly gaining significance as omnipresent image and video capture devices. In this context we present an algorithm that allows such devices to capture high dynamic range (HDR) video. The design of the algorithm was informed by a perceptual study that assesses the relative importance of motion and dynamic range. We found that ghosting artefacts are more visually disturbing than a reduction in dynamic range, even if a comparable number of pixels is affected by each. We incorporated these findings into a real-time, adaptive metering algorithm that seamlessly adjusts its settings to take exposures that lead to minimal visual artefacts after recombination into an HDR sequence, making it uniquely suitable for real-time selection of exposure settings. Finally, we present an off-line HDR reconstruction algorithm that is matched to the adaptive nature of our real-time metering approach.

Item: Unifying Color and Texture Transfer for Predictive Appearance Manipulation (The Eurographics Association and John Wiley & Sons Ltd., 2015)
Authors: Okura, Fumio; Vanhoey, Kenneth; Bousseau, Adrien; Efros, Alexei A.; Drettakis, George
Editors: Jaakko Lehtinen and Derek Nowrouzezahrai
Abstract: Recent color transfer methods use local information to learn the transformation from a source to an exemplar image, and then transfer this appearance change to a target image. These solutions achieve very successful results for general mood changes, e.g., changing the appearance of an image from "sunny" to "overcast". However, such methods have a hard time creating new image content, such as leaves on a bare tree. Texture transfer, on the other hand, can synthesize such content but tends to destroy image structure. We propose the first algorithm that unifies color and texture transfer, outperforming both by leveraging their respective strengths. A key novelty of our approach resides in teasing apart appearance changes that can be modeled simply as changes in color from those that require new image content to be generated. Our method starts with an analysis phase that evaluates the success of color transfer by comparing the exemplar with the source. This analysis then drives a selective, iterative texture transfer algorithm that simultaneously predicts the success of color transfer on the target and synthesizes new content where needed. We demonstrate our unified algorithm by transferring large temporal changes between photographs, such as changes of season (e.g., leaves on bare trees or piles of snow on a street) and flooding.
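For context on the source-to-exemplar transfer setup mentioned in this abstract, the sketch below shows the classic global statistics-matching baseline: learn a per-channel affine change from source to exemplar and apply it to the target. This is only an illustrative baseline under that assumption, not the local, content-aware color transfer or the selective texture synthesis described in the paper; the function name and RGB working space are choices made here for brevity.

    import numpy as np

    def global_color_transfer(source, exemplar, target):
        """Per-channel statistics matching: learn the mean/std change from
        `source` to `exemplar` and apply it to `target`.
        All images are float arrays of shape (H, W, 3) with values in [0, 1].
        Illustrative global baseline only, not the paper's method."""
        s = source.reshape(-1, 3)
        e = exemplar.reshape(-1, 3)
        t = target.reshape(-1, 3)
        # Per-channel affine change: remove source statistics, apply exemplar statistics.
        gain = e.std(axis=0) / (s.std(axis=0) + 1e-8)
        out = (t - s.mean(axis=0)) * gain + e.mean(axis=0)
        return np.clip(out, 0.0, 1.0).reshape(target.shape)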