Learning Scene Illumination by Pairwise Photos from Rear and Front Mobile Cameras

dc.contributor.author: Cheng, Dachuan
dc.contributor.author: Shi, Jian
dc.contributor.author: Chen, Yanyun
dc.contributor.author: Deng, Xiaoming
dc.contributor.author: Zhang, Xiaopeng
dc.contributor.editor: Fu, Hongbo and Ghosh, Abhijeet and Kopf, Johannes
dc.date.accessioned: 2018-10-07T14:59:17Z
dc.date.available: 2018-10-07T14:59:17Z
dc.date.issued: 2018
dc.description.abstract: Illumination estimation is an essential problem in computer vision, graphics, and augmented reality. In this paper, we propose a learning-based method to recover low-frequency scene illumination, represented as spherical harmonic (SH) functions, from paired photos taken by the rear and front cameras of mobile devices. An end-to-end deep convolutional neural network (CNN) is designed to process the two symmetric views and predict SH coefficients. We introduce a novel Render Loss to improve the rendering quality of the predicted illumination. A high-quality high dynamic range (HDR) panoramic image dataset was built for training and evaluation. Experiments show that our model produces visually and quantitatively superior results compared to state-of-the-art methods. Moreover, our method is practical for mobile-based applications.
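The abstract's low-frequency SH representation can be illustrated with the classic second-order SH irradiance formula (Ramamoorthi and Hanrahan, 2001): nine coefficients per color channel suffice to shade a diffuse surface. The sketch below, a plausible reading rather than the authors' code, also shows one possible form of a render-style loss that compares predicted and ground-truth SH by the irradiance they produce over sampled normals; the function names and the squared-error formulation are assumptions.

```python
import numpy as np

# Constants from the Ramamoorthi-Hanrahan irradiance formula.
C1, C2, C3, C4, C5 = 0.429043, 0.511664, 0.743125, 0.886227, 0.247708

def sh_irradiance(L, n):
    """Irradiance at unit normal n from 9 SH coefficients L, ordered
    (L00, L1-1, L10, L11, L2-2, L2-1, L20, L21, L22)."""
    x, y, z = n
    return (C1 * L[8] * (x * x - y * y) + C3 * L[6] * z * z
            + C4 * L[0] - C5 * L[6]
            + 2 * C1 * (L[4] * x * y + L[7] * x * z + L[5] * y * z)
            + 2 * C2 * (L[3] * x + L[1] * y + L[2] * z))

def render_loss(L_pred, L_true, normals):
    """Mean squared irradiance difference over sampled normals -- one
    plausible (hypothetical) render-based loss between SH predictions."""
    diffs = [sh_irradiance(L_pred, n) - sh_irradiance(L_true, n)
             for n in normals]
    return float(np.mean(np.square(diffs)))
```

For a constant environment (only the DC term L00 nonzero), the irradiance is the same for every normal, which is what makes such a loss sensitive to the directional (degree-1 and degree-2) coefficients.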
dc.description.number: 7
dc.description.sectionheaders: Appearance and Illumination
dc.description.seriesinformation: Computer Graphics Forum
dc.description.volume: 37
dc.identifier.doi: 10.1111/cgf.13561
dc.identifier.issn: 1467-8659
dc.identifier.pages: 213-221
dc.identifier.uri: https://doi.org/10.1111/cgf.13561
dc.identifier.uri: https://diglib.eg.org:443/handle/10.1111/cgf13561
dc.publisher: The Eurographics Association and John Wiley & Sons Ltd.
dc.subject: Human-centered computing
dc.subject: Mixed / augmented reality
dc.subject: Computing methodologies
dc.subject: Scene understanding
dc.subject: Rendering
dc.title: Learning Scene Illumination by Pairwise Photos from Rear and Front Mobile Cameras