Single-Image Real-Time Rain Removal Based on Depth-Guided Non-Local Features

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 30, p. 1
Main Authors: Hu, Xiaowei; Zhu, Lei; Wang, Tianyu; Fu, Chi-Wing; Heng, Pheng-Ann
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2021

More Information
Summary: Rain is a common weather phenomenon that affects environmental monitoring and surveillance systems. According to an established rain model [2], scene visibility in the rain varies with the depth from the camera: faraway objects are visually blocked more by fog than by rain streaks. However, existing datasets and methods for rain removal ignore these physical properties, thus limiting rain-removal performance on real photos. In this work, we analyze the visual effects of rain subject to scene depth and formulate a rain imaging model that collectively considers rain streaks and fog. Also, we prepare a new dataset called RainCityscapes based on real outdoor photos. Furthermore, we design a novel real-time end-to-end deep neural network, which we train to learn depth-guided non-local features and to regress a residual map that yields a rain-free output image. We performed various experiments to visually and quantitatively compare our method with several state-of-the-art methods, showing its superiority over them.
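
As a rough illustration of the depth-dependent rain imaging described in the summary, the short Python/NumPy sketch below composes a rainy image from a clean image, an additive rain-streak layer, and a fog term that grows with scene depth. It is a minimal sketch under assumed conventions: the function name, the parameters beta and airlight, and the composition I = J(1 - R - F) + R + airlight * F with F = 1 - exp(-beta * depth) are illustrative assumptions, not necessarily the paper's exact formulation.

    import numpy as np

    def synthesize_rainy_image(J, rain_streaks, depth, beta=0.01, airlight=0.9):
        # Hypothetical sketch: J is a clean image (H, W, 3) in [0, 1],
        # rain_streaks is an additive streak layer (H, W) in [0, 1],
        # depth is a per-pixel depth map (H, W); beta and airlight are assumed parameters.
        # Distant pixels receive a larger fog weight, so faraway objects are obscured
        # more by fog than by rain streaks, matching the behaviour described above.
        fog = 1.0 - np.exp(-beta * depth)      # depth-dependent fog weight in [0, 1)
        fog = fog[..., None]                   # broadcast over colour channels
        R = rain_streaks[..., None]            # broadcast the streak layer likewise

        # Attenuate the scene radiance by streaks and fog, then add the streak
        # intensity and the fog (airlight) term back on top.
        I = J * (1.0 - R - fog) + R + airlight * fog
        return np.clip(I, 0.0, 1.0)

A residual-style rain-removal network, as mentioned in the summary, would learn to invert such a composition: it predicts a residual map from the rainy input so that the restored image is the input minus the predicted residual.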
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/TIP.2020.3048625