Predicting Perceived Gloss: Do Weak Labels Suffice?

Date: 2024
Journal Title: Computer Graphics Forum
Journal ISSN: 1467-8659
Publisher: The Eurographics Association and John Wiley & Sons Ltd.
Abstract
Estimating perceptual attributes of materials directly from images is a challenging task due to their complex, not fully understood interactions with external factors such as geometry and lighting. Supervised deep learning models have recently been shown to outperform traditional approaches, but they rely on large datasets of human-annotated images for accurate perception predictions. Obtaining reliable annotations is a costly endeavor, aggravated by the limited ability of these models to generalise to different aspects of appearance. In this work, we show how a much smaller set of human annotations ("strong labels") can be effectively augmented with automatically derived "weak labels" in the context of learning a low-dimensional, image-computable gloss metric. We evaluate three alternative weak labels for predicting human gloss perception from limited annotated data. Incorporating weak labels enhances our gloss prediction beyond the current state of the art. Moreover, it enables a substantial reduction in human annotation costs without sacrificing accuracy, whether working with rendered images or real photographs.
CCS Concepts: Computing methodologies → Perception; Dimensionality reduction and manifold learning; Supervised learning

@article{10.1111:cgf.15037,
  journal   = {Computer Graphics Forum},
  title     = {{Predicting Perceived Gloss: Do Weak Labels Suffice?}},
  author    = {Guerrero-Viu, Julia and Subias, Jose Daniel and Serrano, Ana and Storrs, Katherine R. and Fleming, Roland W. and Masia, Belen and Gutierrez, Diego},
  year      = {2024},
  publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.15037}
}