Exploring Fast and Flexible Zero-Shot Low-Light Image/Video Enhancement

Date
2024
Journal Title
Computer Graphics Forum
Journal ISSN
1467-8659
Publisher
The Eurographics Association and John Wiley & Sons Ltd.
Abstract
Low-light image/video enhancement is a challenging task when images or videos are captured under harsh lighting conditions. Existing methods mostly formulate this task as an image-to-image translation problem learned via supervised or unsupervised training. However, such translation methods require an extremely large amount of training data, whether paired or unpaired. In addition, these methods are tied to their specific training data, which makes it difficult for a trained model to enhance other types of images or video. In this paper, we explore a novel, fast, flexible, zero-shot framework for low-light image and video enhancement. Without relying on prior training or on relationships among neighboring frames, we estimate the illumination of the input image/frame with a carefully designed network. The proposed zero-shot, low-light image/video enhancement architecture includes illumination estimation and residual correction modules. The network architecture is very concise and requires no paired or unpaired training data, so low-light enhancement can be performed with a few simple iterations. Despite its simplicity, the method is fast and generalizes well to diverse lighting conditions. Extensive qualitative and quantitative experiments on a variety of images and videos demonstrate the advantages of our method over state-of-the-art methods.
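To make the zero-shot idea described in the abstract concrete, the following is a minimal, illustrative sketch of per-image illumination estimation optimized at test time, with no paired or unpaired training data. All names, layer sizes, loss terms, and weights (IlluminationNet, the exposure/smoothness losses, the 0.6 exposure target, etc.) are assumptions for illustration only, not the authors' actual architecture or objective; in particular, the residual correction module from the paper is omitted here.

```python
# Hypothetical sketch of zero-shot illumination estimation (not the authors' exact method).
import torch
import torch.nn as nn

class IlluminationNet(nn.Module):
    """Tiny CNN that predicts a per-pixel illumination map from a low-light image (illustrative)."""
    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, 3, padding=1), nn.Sigmoid(),  # illumination in (0, 1)
        )

    def forward(self, x):
        return self.body(x)

def enhance(image, iters=100, eps=1e-4, device="cpu"):
    """Optimize the network on a single image/frame -- no prior training data required."""
    net = IlluminationNet().to(device)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    x = image.to(device)
    for _ in range(iters):
        illum = net(x)
        enhanced = x / (illum + eps)  # Retinex-style decomposition: image = reflectance * illumination
        # Non-reference losses (assumed): push mean exposure toward a target, keep illumination smooth.
        exposure_loss = ((enhanced.mean(dim=(2, 3)) - 0.6) ** 2).mean()
        smooth_loss = (illum[:, :, :, 1:] - illum[:, :, :, :-1]).abs().mean() + \
                      (illum[:, :, 1:, :] - illum[:, :, :-1, :]).abs().mean()
        loss = exposure_loss + 0.1 * smooth_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        return (x / (net(x) + eps)).clamp(0, 1)

# Usage: pass a 1x3xHxW tensor in [0, 1], e.g. torchvision.io.read_image(path).unsqueeze(0) / 255.0
# enhanced = enhance(low_light_tensor)
```

For video, the same per-frame optimization applies without exploiting neighboring frames, matching the frame-independent setting described above; how the paper's residual correction refines the result is not reproduced in this sketch.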
CCS Concepts: Computing methodologies → Image processing; Computational photography

        
Citation
@article{10.1111:cgf.15210,
  journal   = {Computer Graphics Forum},
  title     = {{Exploring Fast and Flexible Zero-Shot Low-Light Image/Video Enhancement}},
  author    = {Han, Xianjun and Bao, Taoli and Yang, Hongyu},
  year      = {2024},
  publisher = {The Eurographics Association and John Wiley & Sons Ltd.},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.15210}
}