An Energy-Conserving Hair Shading Model Based on Neural Style Transfer

Date
2020
Publisher
The Eurographics Association
Abstract
We present a novel approach for shading photorealistic hair animation, an essential visual element for depicting realistic hair on virtual characters. Our model shades high-quality hair quickly by extending conditional Generative Adversarial Networks. It is much faster than previous, computationally expensive rendering algorithms and produces fewer artifacts than other neural image-translation methods. In this work, we provide a novel energy-conserving hair shading model that preserves most of the hair's semi-transparent appearance and accurately reproduces its interaction with the lights in the scene. Our method is easy to implement and is faster and more computationally efficient than previous algorithms.
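The abstract gives no implementation details, so the following is only a minimal, hypothetical sketch of how an energy-conservation term could be attached to a pix2pix-style conditional GAN objective for hair shading. All names (ShadingGenerator, energy_conservation_penalty, generator_loss), the input buffer layout, and the loss weights are assumptions for illustration, not the authors' method.

# Minimal, hypothetical sketch: a conditional-GAN shading generator with an
# energy-conservation penalty. Not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShadingGenerator(nn.Module):
    """Toy generator: maps hair geometry/lighting buffers to a shaded RGB image."""
    def __init__(self, in_channels=6, out_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_channels, 3, padding=1), nn.Sigmoid(),  # radiance in [0, 1]
        )

    def forward(self, buffers):
        return self.net(buffers)

def energy_conservation_penalty(shaded, incoming_light):
    """Penalize pixels whose predicted outgoing radiance exceeds the incoming energy."""
    excess = torch.clamp(shaded - incoming_light, min=0.0)
    return excess.mean()

def generator_loss(discriminator, shaded, target, incoming_light,
                   lambda_l1=100.0, lambda_energy=10.0):
    """pix2pix-style objective: adversarial + L1 reconstruction + energy term."""
    pred_fake = discriminator(shaded)                     # critic score for the fake render
    adv = F.binary_cross_entropy_with_logits(pred_fake, torch.ones_like(pred_fake))
    recon = F.l1_loss(shaded, target)                     # match the reference render
    energy = energy_conservation_penalty(shaded, incoming_light)
    return adv + lambda_l1 * recon + lambda_energy * energy

In this sketch the energy term simply clamps any predicted radiance above the incoming light; how the paper actually enforces energy conservation, and how the network is conditioned on scene lighting, is described in the full text rather than the abstract.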
Citation
@inproceedings{10.2312:pg.20201222,
  booktitle = {Pacific Graphics Short Papers, Posters, and Work-in-Progress Papers},
  editor    = {Lee, Sung-hee and Zollmann, Stefanie and Okabe, Makoto and Wuensche, Burkhard},
  title     = {{An Energy-Conserving Hair Shading Model Based on Neural Style Transfer}},
  author    = {Qiao, Zhi and Kanai, Takashi},
  year      = {2020},
  publisher = {The Eurographics Association},
  ISBN      = {978-3-03868-120-5},
  DOI       = {10.2312/pg.20201222}
}