URetinex-Net: Retinex-based Deep Unfolding Network for Low-light Image Enhancement

Bibliographic Details
Published in 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 5891 - 5900
Main Authors Wu, Wenhui, Weng, Jian, Zhang, Pingping, Wang, Xu, Yang, Wenhan, Jiang, Jianmin
Format Conference Proceeding
Language English
Published IEEE 01.01.2022

Summary: Retinex model-based methods have been shown to be effective at layer-wise manipulation with well-designed priors for low-light image enhancement. However, the commonly used handcrafted priors and optimization-driven solutions lack adaptivity and efficiency. To address these issues, in this paper we propose a Retinex-based deep unfolding network (URetinex-Net), which unfolds an optimization problem into a learnable network to decompose a low-light image into reflectance and illumination layers. By formulating the decomposition problem as a model regularized by implicit priors, three learning-based modules are carefully designed, responsible for data-dependent initialization, highly efficient unfolding optimization, and user-specified illumination enhancement, respectively. In particular, the proposed unfolding optimization module, which introduces two networks to adaptively fit the implicit priors in a data-driven manner, achieves noise suppression and detail preservation in the final decomposition results. Extensive experiments on real-world low-light images qualitatively and quantitatively demonstrate the effectiveness and superiority of the proposed method over state-of-the-art methods. The code is available at https://github.com/AndersonYong/URetinex-Net.
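For context, the decomposition the summary describes is conventionally posed as a regularized Retinex model. The sketch below uses generic regularizers ρ, φ and tradeoff weights α, β as illustrative stand-ins for the paper's implicit priors; the notation is not taken from the paper itself:

```latex
% Retinex model: the observed low-light image I is the element-wise
% product of a reflectance layer R and an illumination layer L:
%   I = R \circ L
% Regularized decomposition (generic form; \rho, \phi, \alpha, \beta
% are illustrative, not the paper's exact objective):
\[
  \min_{\mathbf{R},\,\mathbf{L}}\;
    \tfrac{1}{2}\,\lVert \mathbf{I} - \mathbf{R} \circ \mathbf{L} \rVert_F^2
    \;+\; \alpha\,\rho(\mathbf{R})
    \;+\; \beta\,\phi(\mathbf{L})
\]
```

A deep unfolding network truncates the alternating minimization of such an objective to a fixed number of stages and lets small networks play the role of the prior (proximal) steps. The following is a hypothetical PyTorch sketch of that general pattern only, not the authors' URetinex-Net architecture; `ProxNet`, `UnfoldedRetinex`, the stage count, and the initialization are all illustrative assumptions:

```python
import torch
import torch.nn as nn

class ProxNet(nn.Module):
    """Small residual CNN standing in for one learned proximal operator
    (an implicit prior fitted in a data-driven manner)."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)

class UnfoldedRetinex(nn.Module):
    """Truncated alternating minimization: per-pixel least-squares updates
    on R and L interleaved with learned prior steps, for a fixed number
    of unfolded stages."""
    def __init__(self, stages=3):
        super().__init__()
        self.prox_R = nn.ModuleList(ProxNet(3) for _ in range(stages))
        self.prox_L = nn.ModuleList(ProxNet(1) for _ in range(stages))

    def forward(self, I):
        eps = 1e-3
        # Crude initialization: illumination as the max channel intensity.
        L = I.max(dim=1, keepdim=True).values.clamp(min=eps)
        R = I / L
        for pR, pL in zip(self.prox_R, self.prox_L):
            # Data-consistency step for R (per-pixel minimizer of the
            # quadratic data term), followed by a learned refinement.
            R = pR(I / L.clamp(min=eps))
            # Same for the single-channel illumination map.
            L = pL((I / R.clamp(min=eps)).mean(dim=1, keepdim=True))
        return R.clamp(0, 1), L.clamp(min=eps)

model = UnfoldedRetinex(stages=3)
low_light = torch.rand(1, 3, 64, 64)  # dummy input image
R, L = model(low_light)
print(R.shape, L.shape)  # torch.Size([1, 3, 64, 64]) torch.Size([1, 1, 64, 64])
```

In this pattern the division updates solve the data term exactly per pixel, and each learned step injects the prior; the paper's actual modules for initialization, unfolding optimization, and illumination adjustment are described in the full text linked above.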
ISSN: 2575-7075
DOI: 10.1109/CVPR52688.2022.00581