WICED 2016

Papers
Automatic Lighting Design from Photographic Rules
Jérémy Wambecke, Romain Vergne, Georges-Pierre Bonneau, and Joëlle Thollot
Contact Visualization
Jean-Eudes Marvie, Gael Sourimant, and A. Dufay
Introducing Basic Principles of Haptic Cinematography and Editing
Philippe Guillotel, Fabien Danieau, Julien Fleureau, and Ines Rouxel
Automated Cinematography with Unmanned Aerial Vehicles
Quentin Galvane, Julien Fleureau, Francois-Louis Tariolle, and Philippe Guillotel
Analysing Cinematography with Embedded Constrained Patterns
Hui-Yin Wu and Marc Christie

BibTeX (WICED 2016)
@inproceedings{10.2312:wiced.20161095,
  booktitle = {Eurographics Workshop on Intelligent Cinematography and Editing},
  editor = {M. Christie and Q. Galvane and A. Jhala and R. Ronfard},
  title = {{Contact Visualization}},
  author = {Marvie, Jean-Eudes and Sourimant, Gael and Dufay, A.},
  year = {2016},
  publisher = {The Eurographics Association},
  ISSN = {2411-9733},
  ISBN = {978-3-03868-005-5},
  DOI = {10.2312/wiced.20161095}
}
@inproceedings{10.2312:wiced.20161094,
  booktitle = {Eurographics Workshop on Intelligent Cinematography and Editing},
  editor = {M. Christie and Q. Galvane and A. Jhala and R. Ronfard},
  title = {{Automatic Lighting Design from Photographic Rules}},
  author = {Wambecke, Jérémy and Vergne, Romain and Bonneau, Georges-Pierre and Thollot, Joëlle},
  year = {2016},
  publisher = {The Eurographics Association},
  ISSN = {2411-9733},
  ISBN = {978-3-03868-005-5},
  DOI = {10.2312/wiced.20161094}
}
@inproceedings{10.2312:wiced.20161096,
  booktitle = {Eurographics Workshop on Intelligent Cinematography and Editing},
  editor = {M. Christie and Q. Galvane and A. Jhala and R. Ronfard},
  title = {{Introducing Basic Principles of Haptic Cinematography and Editing}},
  author = {Guillotel, Philippe and Danieau, Fabien and Fleureau, Julien and Rouxel, Ines},
  year = {2016},
  publisher = {The Eurographics Association},
  ISSN = {2411-9733},
  ISBN = {978-3-03868-005-5},
  DOI = {10.2312/wiced.20161096}
}
@inproceedings{10.2312:wiced.20161098,
  booktitle = {Eurographics Workshop on Intelligent Cinematography and Editing},
  editor = {M. Christie and Q. Galvane and A. Jhala and R. Ronfard},
  title = {{Analysing Cinematography with Embedded Constrained Patterns}},
  author = {Wu, Hui-Yin and Christie, Marc},
  year = {2016},
  publisher = {The Eurographics Association},
  ISSN = {2411-9733},
  ISBN = {978-3-03868-005-5},
  DOI = {10.2312/wiced.20161098}
}
@inproceedings{10.2312:wiced.20161097,
  booktitle = {Eurographics Workshop on Intelligent Cinematography and Editing},
  editor = {M. Christie and Q. Galvane and A. Jhala and R. Ronfard},
  title = {{Automated Cinematography with Unmanned Aerial Vehicles}},
  author = {Galvane, Quentin and Fleureau, Julien and Tariolle, Francois-Louis and Guillotel, Philippe},
  year = {2016},
  publisher = {The Eurographics Association},
  ISSN = {2411-9733},
  ISBN = {978-3-03868-005-5},
  DOI = {10.2312/wiced.20161097}
}


Recent Submissions

  • Contact Visualization
    (The Eurographics Association, 2016) Marvie, Jean-Eudes; Sourimant, Gael; Dufay, A.
    We present in this paper a production-oriented technique designed to visualize contact between 3D objects in real time. The motivation of this work is to provide integrated tools in the production workflow that help artists set up scenes and assets without undesired floating objects or inter-penetrations. Such issues can occur easily and remain unnoticed until the shading and/or lighting stages are set up, leading to retakes of the modeling or animation stages. With our solution, artists can visualize contact between 3D objects in real time while setting up their assets, thus correcting such misalignments earlier. Being based on a cheap post-processing shader, our solution can be used even on low-end GPUs.
  • Automatic Lighting Design from Photographic Rules
    (The Eurographics Association, 2016) Wambecke, Jérémy; Vergne, Romain; Bonneau, Georges-Pierre; Thollot, Joëlle
    Lighting design is crucial in 3D scene modeling for its ability to provide cues that help viewers understand an object's shape. However, considerable time, skill, and trial and error are required to obtain a desired result. Existing automatic lighting methods for conveying the shape of 3D objects are based either on costly optimizations or on non-realistic shading effects, and they do not take material information into account. In this paper, we propose a new method that automatically suggests a lighting setup to reveal the shape of a 3D model, taking into account its material and geometric properties. Our method is independent of the rendering algorithm. It is based on lighting rules extracted from photography books, applied through a fast and simple geometric analysis. We illustrate our algorithm on objects with different shapes and materials, and we show by both visual and metric evaluation that it is comparable to optimization methods in terms of lighting-setup quality. Thanks to its genericity, our algorithm could be integrated into any rendering pipeline to suggest appropriate lighting.
  • WICED 2016: Frontmatter
    (The Eurographics Association, 2016) Ronfard, Rémi; Christie, Marc; Galvane, Quentin; Jhala, Arnav
  • Introducing Basic Principles of Haptic Cinematography and Editing
    (The Eurographics Association, 2016) Guillotel, Philippe; Danieau, Fabien; Fleureau, Julien; Rouxel, Ines
    Adding the sense of touch to hearing and seeing is necessary for a truly immersive experience. This is the promise of the growing "4D cinema" based on motion platforms and other sensory effects (water spray, wind, scent, etc.). Touch provides a new dimension for filmmakers and opens a new creative area: haptic cinematography. However, design rules are required to use this sensory modality appropriately and enhance the user experience. This paper addresses this issue by introducing principles of haptic cinematography editing. The proposed elements are based on early feedback from different creative works performed by the authors (including a student in cinema arts), anticipating the role of haptographers, experts in haptic content creation. Three full short movies have been augmented with haptic feedback and tested by numerous users to provide the input for this introductory paper.
  • Analysing Cinematography with Embedded Constrained Patterns
    (The Eurographics Association, 2016) Wu, Hui-Yin; Christie, Marc
    Cinematography carries messages about the plot, emotion, or general feeling of a film. Yet cinematographic devices are often overlooked in existing approaches to film analysis. In this paper, we present Embedded Constrained Patterns (ECPs), a dedicated query language to search annotated film clips for sequences that fulfill complex stylistic constraints. ECPs are groups of framing and sequencing constraints defined using vocabulary from film textbooks. Using a set algorithm, all occurrences of the ECPs can be found in annotated film sequences. We use a film clip from The Lord of the Rings to demonstrate a range of ECPs that can be detected, and analyse them in relation to story and emotions in the film.
  • Automated Cinematography with Unmanned Aerial Vehicles
    (The Eurographics Association, 2016) Galvane, Quentin; Fleureau, Julien; Tariolle, Francois-Louis; Guillotel, Philippe
    The rise of Unmanned Aerial Vehicles and their increasing use in the cinema industry call for the creation of dedicated tools. Though there is a range of techniques to automatically control drones for a variety of applications, none have considered the problem of producing cinematographic camera motion in real time for shooting purposes. In this paper we present our approach to UAV navigation for autonomous cinematography. The contributions of this research are twofold: (i) we adapt virtual camera control techniques to UAV navigation; (ii) we introduce a drone-independent platform for high-level user interactions that integrates cinematographic knowledge. The results presented in this paper demonstrate the capabilities of our tool to capture live movie scenes involving one or two moving actors.