SkyGAN: Towards Realistic Cloud Imagery for Image Based Lighting

Abstract
Achieving photorealism when rendering virtual scenes in movies or architectural visualizations often depends on providing realistic illumination and a realistic background. Typically, spherical environment maps serve both as a natural light source from the Sun and the sky, and as a background with clouds and a horizon. In practice, the input is either a static high-resolution HDR photograph manually captured on location in real conditions, or an analytical clear-sky model that is dynamic but cannot model clouds. Our approach bridges these two limited paradigms: a user can control the sun position and the cloud coverage ratio, and generate a realistic-looking environment map for these conditions. It is a hybrid data-driven analytical model based on a modified state-of-the-art GAN architecture, trained on matching pairs of physically accurate clear-sky radiance and HDR fisheye photographs of clouds. We demonstrate our results on renderings of outdoor scenes under varying times, dates, and cloud covers.
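
As an illustration only, the pipeline described above can be read as a conditional generator: analytical clear-sky radiance map + user controls (sun position, cloud coverage) + latent code → cloudy HDR environment map. The minimal PyTorch sketch below is a hypothetical stand-in for that interface, not the authors' modified state-of-the-art GAN; all module and parameter names (SkyGenerator, latent_dim, cond_dim, base_ch) are invented for this example.

    # Hypothetical sketch (not the paper's code): a tiny conditional generator that
    # maps an analytical clear-sky radiance map plus user controls (sun elevation,
    # sun azimuth, cloud coverage) and a latent code to a cloudy HDR sky map.
    import torch
    import torch.nn as nn

    class SkyGenerator(nn.Module):
        def __init__(self, latent_dim: int = 64, cond_dim: int = 3, base_ch: int = 32):
            super().__init__()
            # Map latent code + conditions to a per-channel bias injected into the image branch.
            self.cond = nn.Sequential(
                nn.Linear(latent_dim + cond_dim, base_ch),
                nn.ReLU(inplace=True),
            )
            # Image branch: consumes the 3-channel clear-sky radiance map and
            # predicts an HDR sky (with clouds) at the same resolution.
            self.net = nn.Sequential(
                nn.Conv2d(3, base_ch, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(base_ch, base_ch, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(base_ch, 3, 3, padding=1),
                nn.Softplus(),  # keep HDR radiance non-negative
            )

        def forward(self, clear_sky: torch.Tensor, z: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
            # clear_sky: (B, 3, H, W) analytical clear-sky radiance
            # z:         (B, latent_dim) latent code controlling cloud appearance
            # cond:      (B, 3) user controls, e.g. (sun_elevation, sun_azimuth, cloud_cover)
            bias = self.cond(torch.cat([z, cond], dim=1))            # (B, base_ch)
            feat = self.net[0](clear_sky) + bias[:, :, None, None]   # inject conditioning after first conv
            return self.net[1:](feat)

    if __name__ == "__main__":
        gen = SkyGenerator()
        clear_sky = torch.rand(1, 3, 128, 256)          # placeholder clear-sky map
        z = torch.randn(1, 64)
        cond = torch.tensor([[45.0, 180.0, 0.5]])       # elevation, azimuth, coverage (unnormalized here)
        hdr_sky = gen(clear_sky, z, cond)
        print(hdr_sky.shape)                            # torch.Size([1, 3, 128, 256])

In the actual method, the generator and a discriminator would be trained adversarially on the matching pairs of clear-sky radiance and HDR fisheye photographs mentioned in the abstract; this sketch only shows the conditioning interface.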

CCS Concepts: Computing methodologies → Rendering; Supervised learning; Applied computing → Earth and atmospheric sciences

        
Citation

@inproceedings{10.2312:sr.20221151,
  booktitle = {Eurographics Symposium on Rendering},
  editor    = {Ghosh, Abhijeet and Wei, Li-Yi},
  title     = {{SkyGAN: Towards Realistic Cloud Imagery for Image Based Lighting}},
  author    = {Mirbauer, Martin and Rittig, Tobias and Iser, Tomáš and Křivánek, Jaroslav and Šikudová, Elena},
  year      = {2022},
  publisher = {The Eurographics Association},
  ISSN      = {1727-3463},
  ISBN      = {978-3-03868-187-8},
  DOI       = {10.2312/sr.20221151}
}