Image filtering for interactive level-of-abstraction visualization of 3D scenes

Date
2014
Publisher
ACM
Abstract
Texture mapping is a key technology in computer graphics for the visual design of rendered 3D scenes. Effective information transfer of surface properties encoded by textures, however, depends significantly on how important information is highlighted and cognitively processed by the user in an application context. Edge-preserving image filtering is a promising approach to address this concern while preserving globally salient structures. Much research has focused on applying image filters in a post-processing stage to foster an artistically stylized rendering, but these approaches are generally unable to preserve depth cues important for 3D visualization (e.g., texture gradient). To this end, filtering that processes texture data coherently with respect to linear perspective and spatial relationships is required. In this work, we present a system that enables processing of textured 3D scenes with perspective coherence by arbitrary image filters. We propose decoupled deferred texturing that (1) uses caching strategies to interactively perform image filtering prior to texture mapping, and (2) filters each mipmap level separately to enable a progressive level of abstraction. We demonstrate the potential of our methods in several applications, including illustrative visualization, focus+context visualization, geometric detail removal, and depth of field. Our system supports frame-to-frame coherence, order-independent transparency, multitexturing, and content-based filtering.
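The per-mipmap-level filtering idea from the abstract can be illustrated with a minimal sketch. This is not the paper's implementation (which runs on the GPU with caching and deferred texturing); it is a plain-Python toy on grayscale images that shows the structure: build a mipmap chain by 2x2 averaging, then apply an (arbitrary) image filter to each level independently. The function names `downsample`, `box_blur`, and `filtered_mipmaps` are illustrative, not from the paper.

```python
def downsample(img):
    """One mipmap step: 2x2 average downsampling of a list-of-lists image."""
    h, w = len(img), len(img[0])
    return [[(img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
              img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(w // 2)]
            for y in range(h // 2)]

def box_blur(img, radius):
    """Naive box blur; stands in for an arbitrary (e.g. edge-preserving) filter."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        n += 1
            out[y][x] = acc / n
    return out

def filtered_mipmaps(base, levels, radius=1):
    """Build a mipmap chain and filter each level separately, so that the
    renderer's level-of-detail selection yields a progressive abstraction."""
    pyramid = [base]
    for _ in range(levels - 1):
        pyramid.append(downsample(pyramid[-1]))
    return [box_blur(level, radius) for level in pyramid]

# Example: a 4x4 gradient image filtered across 3 mipmap levels.
base = [[float(x + y) for x in range(4)] for y in range(4)]
mips = filtered_mipmaps(base, levels=3, radius=1)
```

Because each level is filtered in texture space before any texture lookup, the filter result stays coherent under perspective projection, unlike a screen-space post-process.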
@inproceedings{10.1145:2630099.2630101,
  booktitle = {Eurographics Workshop on Computational Aesthetics in Graphics, Visualization and Imaging},
  editor    = {Paul Rosin},
  title     = {{Image filtering for interactive level-of-abstraction visualization of 3D scenes}},
  author    = {Semmo, Amir and Döllner, Jürgen},
  year      = {2014},
  publisher = {ACM},
  ISSN      = {1816-0859},
  ISBN      = {978-1-4503-3019-0},
  DOI       = {10.1145/2630099.2630101}
}