Browsing by Author "Thollot, Joëlle"
Now showing 1 - 5 of 5
Item: Coherent Mark-based Stylization of 3D Scenes at the Compositing Stage (The Eurographics Association and John Wiley & Sons Ltd., 2021)
Garcia, Maxime; Vergne, Romain; Farhat, Mohamed-Amine; Bénard, Pierre; Noûs, Camille; Thollot, Joëlle; Mitra, Niloy and Viola, Ivan
We present a novel temporally coherent stylized rendering technique that works entirely at the compositing stage. We first generate a distribution of 3D anchor points using an implicit grid based on the local object positions stored in a G-buffer, so that the points follow object motion. We then draw splats in screen space anchored to these points, making them motion coherent. To increase the perceived flatness of the style, we adjust the anchor point density using a fractalization mechanism. Sudden changes are prevented by controlling the anchor point opacity and introducing a new order-independent blending function. We demonstrate the versatility of our method by showing a large variety of styles, enabled by the freedom offered by the splats' content and attributes, which can be controlled by any G-buffer.

Item: Local Light Alignment for Multi-Scale Shape Depiction (The Eurographics Association and John Wiley & Sons Ltd., 2021)
Mestres, Nolan; Vergne, Romain; Noûs, Camille; Thollot, Joëlle; Mitra, Niloy and Viola, Ivan
Motivated by recent findings in the field of visual perception, we present a novel approach for enhancing shape depiction and the perception of surface details. We propose a shading-based technique that relies on locally adjusting the direction of light to account for the different components of materials. Our approach ensures congruence between shape and shading flows, leading to an effective enhancement of the perception of shape and details while impairing neither the lighting nor the appearance of materials. It is formulated in a general way that allows multi-scale enhancement in real time on the GPU, as well as use in global illumination contexts. We also provide artists with fine control over the enhancement at each scale.

Item: Making Gabor Noise Fast and Normalized (The Eurographics Association, 2019)
Tavernier, Vincent; Neyret, Fabrice; Vergne, Romain; Thollot, Joëlle; Cignoni, Paolo and Miguel, Eder
Gabor Noise is a powerful procedural texture synthesis technique, but it has two major drawbacks: it is costly due to the high required splat density, and it is not always predictable because the properties of instances can differ from those of the process. We benchmark performance and quality using alternatives for each Gabor Noise ingredient: point distribution, kernel weighting, and kernel shape. For this, we introduce three objective criteria to measure process convergence, process stationarity, and instance stationarity. We show that minor implementation changes allow for a 17-24x speed-up with the same or better quality.
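For readers unfamiliar with the technique analyzed in the Gabor Noise entry above, the following is a minimal, unoptimized sketch of the classic sparse-convolution Gabor Noise construction: a sum of randomly weighted Gabor kernels splatted at pseudo-random impulse positions generated per grid cell. It is purely illustrative; the cell size, impulse density, and kernel parameters are placeholder values, and it does not reflect the authors' optimized implementation or the alternative ingredients they benchmark.

```python
import numpy as np

def gabor_kernel(dx, dy, K=1.0, a=0.05, F0=0.0625, omega0=np.pi / 4):
    """Gaussian-windowed cosine: the basic Gabor kernel."""
    gaussian = K * np.exp(-np.pi * a**2 * (dx**2 + dy**2))
    sinusoid = np.cos(2.0 * np.pi * F0 * (dx * np.cos(omega0) + dy * np.sin(omega0)))
    return gaussian * sinusoid

def gabor_noise(x, y, cell_size=20.0, impulses_per_cell=8, seed=0):
    """Sum randomly weighted Gabor kernels splatted around (x, y)."""
    value = 0.0
    cx, cy = int(np.floor(x / cell_size)), int(np.floor(y / cell_size))
    for i in (-1, 0, 1):                # visit the 3x3 cell neighborhood
        for j in (-1, 0, 1):
            # Seed a per-cell RNG so impulse positions and weights are stable
            rng = np.random.default_rng(abs(hash((cx + i, cy + j, seed))))
            for _ in range(impulses_per_cell):
                px = (cx + i + rng.random()) * cell_size   # impulse position
                py = (cy + j + rng.random()) * cell_size
                w = rng.uniform(-1.0, 1.0)                 # random kernel weight
                value += w * gabor_kernel(x - px, y - py)
    return value
```

Evaluating gabor_noise over a pixel grid yields an anisotropic, band-limited noise; the three ingredients the paper varies (point distribution, kernel weighting, kernel shape) correspond to the impulse placement, the weight term, and the kernel function in this sketch.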
Item: MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics (ACM, 2018)
Montesdeoca, Santiago E.; Seah, Hock Soon; Semmo, Amir; Bénard, Pierre; Vergne, Romain; Thollot, Joëlle; Benvenuti, Davide; Aydın, Tunç and Sýkora, Daniel
We propose a framework for expressive non-photorealistic rendering of 3D computer graphics: MNPR. Our work focuses on enabling stylization pipelines with a wide range of control, thereby covering the interaction spectrum with real-time feedback. In addition, we introduce control semantics that allow cross-stylistic art direction, demonstrated through our implemented watercolor, oil, and charcoal stylizations. Our generalized control semantics and their style-specific mappings are designed to be extrapolated to other styles by adhering to the same control scheme. We then share our implementation details by breaking down our framework and elaborating on its inner workings. Finally, we evaluate the usefulness of each level of control through a user study involving 20 experienced artists and engineers in the industry, who have collectively spent over 245 hours using our system. MNPR is implemented in Autodesk Maya and open-sourced through this publication to facilitate adoption by artists and further development by the expressive research and development community.

Item: Motion-coherent stylization with screen-space image filters (ACM, 2018)
Bléron, Alexandre; Vergne, Romain; Hurtut, Thomas; Thollot, Joëlle; Aydın, Tunç and Sýkora, Daniel
One of the qualities sought in expressive rendering is the 2D impression of the resulting style, called flatness. In the context of 3D scenes, screen-space stylization techniques are good candidates for flatness as they operate in the 2D image plane, after the scene has been rendered into so-called G-buffers. Various stylization filters can be applied in screen space while making use of the geometric information contained in the G-buffers to ensure motion coherence. However, this means that filtering can only be done inside the rasterized surface of the object, which can be detrimental to styles that require irregular silhouettes to be convincing. In this paper, we describe a post-processing pipeline that allows stylization filters to extend outside the rasterized footprint of the object by locally "inflating" the data contained in G-buffers. This pipeline is fully implemented on the GPU and can be evaluated at interactive rates. We show how common image filtering techniques, when integrated in our pipeline and combined with G-buffer data, can be used to reproduce a wide range of "digitally painted" appearances, such as directed brush strokes with irregular silhouettes, while keeping a degree of motion coherence.
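As a loose illustration of the idea behind the last entry above, extending G-buffer data beyond an object's rasterized footprint, here is a minimal CPU-side sketch of an iterative dilation that propagates covered pixels outward, assuming a NumPy image representation. It is not the paper's GPU "inflation" operator; the function and parameter names are hypothetical.

```python
import numpy as np

def dilate_gbuffer(gbuffer, coverage, iterations=8):
    """Copy G-buffer values from covered pixels into neighboring uncovered
    pixels, growing the usable region by one pixel per iteration so that
    screen-space filters can sample past the rasterized silhouette.

    gbuffer:  H x W x C float array (e.g. normals, depth, object IDs)
    coverage: H x W bool array, True where the object was rasterized
    """
    data = gbuffer.copy()
    mask = coverage.copy()
    for _ in range(iterations):
        grown = data.copy()
        grown_mask = mask.copy()
        # Visit the 4 axis-aligned neighbors of every pixel.
        # Note: np.roll wraps at image borders; a real implementation
        # would clamp or pad instead.
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            shifted = np.roll(data, (dy, dx), axis=(0, 1))
            shifted_mask = np.roll(mask, (dy, dx), axis=(0, 1))
            take = shifted_mask & ~grown_mask   # uncovered pixels with a covered neighbor
            grown[take] = shifted[take]
            grown_mask |= take
        data, mask = grown, grown_mask
    return data, mask
```

In practice this kind of growth would run as a few ping-pong passes of a fragment shader; the paper's pipeline additionally keeps the extended data locally consistent with the underlying geometry so that filters applied outside the silhouette remain motion coherent.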