32-Issue 7
Browsing 32-Issue 7 by Subject "Color"
Now showing 1 - 3 of 3
Item: Boundary-Aware Extinction Mapping
(The Eurographics Association and Blackwell Publishing Ltd., 2013) Gautron, Pascal; Delalandre, Cyril; Marvie, Jean-Eudes; Lecocq, Pascal; B. Levy, X. Tong, and K. Yin
We introduce Boundary-Aware Extinction Maps for interactive rendering of massive heterogeneous volumetric datasets. Our approach is based on the projection of the extinction along light rays into a boundary-aware function space, focusing on the most relevant sections of the light paths. This technique also provides an alternative representation of the set of participating media, allowing scattering simulation methods to be applied to arbitrary volume representations. Combined with a simple out-of-core rendering framework, Boundary-Aware Extinction Maps are valuable tools for interactive applications as well as production previsualization and rendering.

Item: Eye-Centered Color Adaptation in Global Illumination
(The Eurographics Association and Blackwell Publishing Ltd., 2013) Gruson, Adrien; Ribardière, Mickael; Cozot, Remi; B. Levy, X. Tong, and K. Yin
Color adaptation is a well-known ability of the human visual system (HVS). Colors are perceived as constant even though the illuminant color changes. Indeed, the perceived color of a diffuse white sheet of paper is still white even when it is illuminated by a single orange tungsten light, whereas it is orange from a physical point of view. Unfortunately, global illumination algorithms focus only on the physical aspects of light transport. The output of a global illumination engine is an image which has to undergo chromatic adaptation to recover the color as perceived by the HVS. In this paper, we propose a new color adaptation method well suited to global illumination. This method estimates the adaptation color by averaging the irradiance color arriving at the eye. Unlike other existing methods, our approach is not limited to the view frustum, as it considers the illumination from the entire scene. Experiments have shown that our method outperforms state-of-the-art methods.

Item: Level-of-Detail Streaming and Rendering using Bidirectional Sparse Virtual Texture Functions
(The Eurographics Association and Blackwell Publishing Ltd., 2013) Schwartz, Christopher; Ruiters, Roland; Klein, Reinhard; B. Levy, X. Tong, and K. Yin
Bidirectional Texture Functions (BTFs) are among the highest-quality material representations available today and thus well suited whenever an exact reproduction of the appearance of a material or complete object is required. In recent years, BTFs have started to find application in various industrial settings, and there is also a growing interest in the cultural heritage domain. BTFs are usually measured from real-world samples and easily consist of tens or hundreds of gigabytes. By using data-driven compression schemes, such as matrix or tensor factorization, a more compact but still faithful representation can be derived. This way, BTFs can be employed for real-time rendering of photo-realistic materials on the GPU. However, scenes containing multiple BTFs or even single objects with high-resolution BTFs easily exceed the available GPU memory on today's consumer graphics cards unless quality is drastically reduced by the compression. In this paper, we propose the Bidirectional Sparse Virtual Texture Function, a hierarchical level-of-detail approach for the real-time rendering of large BTFs that requires only a small amount of GPU memory. More importantly, for larger numbers of BTFs or higher resolutions, the GPU and CPU memory demand grows only marginally and the GPU workload remains constant. For this, we extend the concept of sparse virtual textures by choosing an appropriate prioritization, finding a trade-off between factorization components and spatial resolution. Besides GPU memory, the high demand on bandwidth poses a serious limitation for the deployment of conventional BTFs. We show that our proposed representation can be combined with an additional transmission compression and then be employed for streaming the BTF data to the GPU from local storage media or over the Internet. In combination with the introduced prioritization, this allows for the fast visualization of relevant content in the user's field of view and consecutive progressive refinement.
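To give a rough flavor of the boundary-aware projection described in the first item, the sketch below projects the extinction sampled along a single light ray onto a small cosine basis whose support is restricted to the interval between the entry and exit points of the participating media, so the representation is spent only on the relevant section of the light path. The choice of basis, the crude quadrature, and the names project_extinction, optical_depth and n_coeffs are illustrative assumptions, not the paper's actual formulation.

import numpy as np

def project_extinction(extinction, t, t_in, t_out, n_coeffs=8):
    """Project extinction samples along one light ray onto a cosine basis
    defined only over [t_in, t_out], the boundaries of the participating
    media (illustrative sketch, not the paper's function space)."""
    u = (t - t_in) / (t_out - t_in)        # normalized depth in [0, 1]
    dt = np.gradient(t)                    # per-sample step for crude quadrature
    return np.array([np.sum(extinction * np.cos(np.pi * k * u) * dt)
                     for k in range(n_coeffs)])

def optical_depth(coeffs, t, t_in, t_out):
    """Reconstruct the optical depth from the media entry point t_in up to
    depth t by integrating the cosine series analytically
    (transmittance = exp(-tau))."""
    u = np.clip((t - t_in) / (t_out - t_in), 0.0, 1.0)
    tau = coeffs[0] * u
    for k in range(1, len(coeffs)):
        tau += 2.0 * coeffs[k] * np.sin(np.pi * k * u) / (np.pi * k)
    return tau

# Example: constant extinction 0.5 between depths 2 and 5 gives
# tau(4) close to 0.5 * (4 - 2) = 1.0, so transmittance close to exp(-1).
t_samples = np.linspace(2.0, 5.0, 64)
coeffs = project_extinction(np.full(64, 0.5), t_samples, 2.0, 5.0)
transmittance = np.exp(-optical_depth(coeffs, 4.0, 2.0, 5.0))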
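The second item estimates the adaptation color by averaging the irradiance color arriving at the eye and then correcting the rendered image accordingly. The sketch below shows a generic von Kries-style chromatic adaptation driven by such an average; for brevity it averages the rendered image itself, which only covers the view frustum, whereas the paper explicitly gathers illumination from the entire scene. The matrices are the standard linear sRGB and Bradford values; the function name and interface are assumptions.

import numpy as np

# Linear sRGB -> XYZ (D65), and Bradford XYZ -> sharpened cone responses.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])
BRADFORD = np.array([[ 0.8951,  0.2664, -0.1614],
                     [-0.7502,  1.7135,  0.0367],
                     [ 0.0389, -0.0685,  1.0296]])
RGB_TO_LMS = BRADFORD @ RGB_TO_XYZ
LMS_TO_RGB = np.linalg.inv(RGB_TO_LMS)

def eye_centered_adaptation(hdr_image, reference_white=(1.0, 1.0, 1.0)):
    """Von Kries-style chromatic adaptation of a linear HDR image (H, W, 3),
    using the mean color reaching the eye as the adaptation color."""
    pixels = hdr_image.reshape(-1, 3).astype(float)
    adapt_lms = RGB_TO_LMS @ pixels.mean(axis=0)      # estimated adaptation color
    ref_lms = RGB_TO_LMS @ np.asarray(reference_white, dtype=float)
    scale = ref_lms / np.maximum(adapt_lms, 1e-8)     # per-cone gain
    adapted = ((pixels @ RGB_TO_LMS.T) * scale) @ LMS_TO_RGB.T
    return adapted.reshape(hdr_image.shape)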