Modeling Surround-aware Contrast Sensitivity

Abstract
Despite advances in display technology, many existing applications rely on psychophysical datasets of human perception gathered on older, sometimes outdated displays. This carries the implicit assumption that such measurements can be transferred to the viewing conditions of more modern technology. We have conducted a series of psychophysical experiments to explore contrast sensitivity using a state-of-the-art HDR display, taking into account not only the spatial frequency and luminance of the stimuli but also their surrounding luminance levels. From our data, we derive a novel surround-aware contrast sensitivity function (CSF), which predicts human contrast sensitivity more accurately than existing models. We additionally provide a practical version that retains the benefits of our full model while enabling easy backward compatibility and consistently producing good results across the many existing applications that make use of CSF models. We show examples of effective HDR video compression using a transfer function derived from our CSF, as well as tone mapping and improved accuracy in visual difference prediction.
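The abstract describes a CSF parameterized by spatial frequency, stimulus luminance, and surround luminance. As a rough illustration of that interface only (not the authors' fitted model), the Python sketch below evaluates a classical luminance-only CSF (Mannos-Sakrison) and modulates it with hypothetical stimulus- and surround-luminance terms; the function names, constants, and modulation factors are assumptions for illustration.

import numpy as np

def mannos_sakrison_csf(f_cpd):
    """Classical luminance-only CSF (Mannos & Sakrison, 1974),
    with spatial frequency f_cpd in cycles per degree."""
    return 2.6 * (0.0192 + 0.114 * f_cpd) * np.exp(-(0.114 * f_cpd) ** 1.1)

def surround_aware_csf(f_cpd, L_stimulus, L_surround):
    """Hypothetical sketch: modulate a classical CSF by stimulus and
    surround luminance (cd/m^2). The modulation terms below are
    illustrative placeholders, NOT the model derived in the paper."""
    base = mannos_sakrison_csf(f_cpd)
    # Assumed luminance dependence: sensitivity rises with adaptation
    # luminance and saturates (a common qualitative behavior).
    lum_gain = L_stimulus / (L_stimulus + 10.0)
    # Assumed surround dependence: sensitivity drops when the surround
    # is much brighter or darker than the stimulus.
    mismatch = abs(np.log10(L_surround + 1e-6) - np.log10(L_stimulus + 1e-6))
    surround_gain = 1.0 / (1.0 + 0.25 * mismatch)
    return base * lum_gain * surround_gain

# Example: sensitivity at 4 cpd for a 100 cd/m^2 stimulus on a dim surround.
print(surround_aware_csf(4.0, L_stimulus=100.0, L_surround=5.0))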
Citation

@inproceedings{10.2312:sr.20211303,
  booktitle = {Eurographics Symposium on Rendering - DL-only Track},
  editor    = {Bousseau, Adrien and McGuire, Morgan},
  title     = {{Modeling Surround-aware Contrast Sensitivity}},
  author    = {Yi, Shinyoung and Jeon, Daniel S. and Serrano, Ana and Jeong, Se-Yoon and Kim, Hui-Yong and Gutierrez, Diego and Kim, Min H.},
  year      = {2021},
  publisher = {The Eurographics Association},
  ISSN      = {1727-3463},
  ISBN      = {978-3-03868-157-1},
  DOI       = {10.2312/sr.20211303}
}