AIPNet: Image-to-Image Single Image Dehazing With Atmospheric Illumination Prior


Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 28, No. 1, pp. 381-393
Main Authors: Wang, Anna; Wang, Wenhui; Liu, Jinglu; Gu, Nanhui
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2019

Summary: Atmospheric scattering and absorption give rise to the natural phenomenon of haze, which severely degrades scene visibility; images captured in hazy conditions therefore tend to appear overly bright and ambiguous. To address the ill-posed and intractable problem of single image dehazing, we propose a straightforward yet effective prior, the atmospheric illumination prior. Extensive statistical experiments across different color spaces, together with theoretical analysis, indicate that atmospheric illumination in hazy weather mainly affects the luminance channel of the YCrCb color space and has far less impact on the chrominance channels. Guided by this prior, we aim to preserve the intrinsic color of the hazy scene while enhancing its visual contrast. To this end, we apply multiscale convolutional networks that automatically identify hazy regions and restore missing texture information. Compared with previous methods, the deep CNNs not only yield an end-to-end trainable model but also provide a simple image-to-image system architecture. Extensive comparisons and analyses against existing approaches demonstrate that the proposed method achieves state-of-the-art dehazing performance.
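
The authors' network and training code are not included in this record. The following is only a minimal sketch of the channel-splitting idea suggested by the atmospheric illumination prior: convert a hazy RGB image to YCrCb (assuming standard ITU-R BT.601 coefficients), restore the luminance channel, and leave the chrominance channels untouched. The dehaze_luminance placeholder below (a simple contrast stretch) is a hypothetical stand-in for the paper's multiscale CNN, not the authors' method.

```python
# Illustrative sketch of the atmospheric illumination prior pipeline:
# dehaze the luminance (Y) channel only, preserve chrominance (Cr, Cb).
import numpy as np

def rgb_to_ycrcb(rgb):
    """Convert an RGB image in [0, 1] to YCrCb (ITU-R BT.601, full range)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b + 0.5
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 0.5
    return np.stack([y, cr, cb], axis=-1)

def ycrcb_to_rgb(ycrcb):
    """Inverse conversion back to RGB in [0, 1]."""
    y, cr, cb = ycrcb[..., 0], ycrcb[..., 1] - 0.5, ycrcb[..., 2] - 0.5
    r = y + 1.402 * cr
    g = y - 0.714136 * cr - 0.344136 * cb
    b = y + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

def dehaze_luminance(y):
    """Placeholder for the paper's multiscale CNN: a global percentile
    contrast stretch stands in for the learned luminance restoration."""
    lo, hi = np.percentile(y, 1), np.percentile(y, 99)
    return np.clip((y - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

def dehaze(rgb_hazy):
    ycrcb = rgb_to_ycrcb(rgb_hazy)
    ycrcb[..., 0] = dehaze_luminance(ycrcb[..., 0])  # restore luminance only
    return ycrcb_to_rgb(ycrcb)                       # chrominance kept intact

if __name__ == "__main__":
    hazy = np.random.rand(64, 64, 3).astype(np.float32)  # stand-in hazy image
    print(dehaze(hazy).shape)
```

Operating only on the Y channel reflects the prior's claim that haze-induced atmospheric illumination concentrates in luminance, so the scene's intrinsic color (carried by Cr and Cb) is preserved by construction.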
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2018.2868567