A Novel Tensor-Based Video Rain Streaks Removal Approach via Utilizing Discriminatively Intrinsic Priors

Bibliographic Details
Published in: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 2818-2827
Main Authors: Tai-Xiang Jiang, Ting-Zhu Huang, Xi-Le Zhao, Liang-Jian Deng, Yao Wang
Format: Conference Proceeding
Language: English
Published: IEEE, 01.07.2017

Summary: Rain streak removal is an important problem for outdoor vision systems and has recently been investigated extensively. In this paper, we propose a novel tensor-based video rain streak removal approach that fully exploits the discriminatively intrinsic characteristics of rain streaks and clean videos, requiring neither a rain-detection stage nor time-consuming dictionary learning. Specifically, rain streaks are sparse and smooth along the direction of the raindrops, while clean videos are smooth along the rain-perpendicular direction and exhibit global and local correlation along the time direction. We use the ℓ1 norm to enhance the sparsity of the underlying rain, two unidirectional total variation (TV) regularizers to guarantee the different discriminative smoothness, and a tensor nuclear norm together with a time-directional difference operator to characterize the exclusive temporal correlation of the clean video. The alternating direction method of multipliers (ADMM) is employed to solve the proposed concise tensor-based convex model. Experiments on synthetic and real data substantiate the effectiveness and efficiency of the proposed method: under comprehensive quantitative performance measures, our approach outperforms other state-of-the-art methods.
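The regularizers named in the summary (ℓ1 sparsity of the rain, unidirectional TV terms, a tensor nuclear norm, and a time-directional difference) can be sketched numerically. The axis assignments, weights, and function names below are illustrative assumptions for a video tensor of shape (height, width, time), not the paper's exact formulation:

```python
import numpy as np

def unidirectional_tv(x, axis):
    """Anisotropic, unidirectional TV: sum of absolute first differences along one axis."""
    return np.abs(np.diff(x, axis=axis)).sum()

def tensor_nuclear_norm(x, mode):
    """Nuclear norm (sum of singular values) of the mode-`mode` unfolding of a 3-way tensor."""
    mat = np.moveaxis(x, mode, 0).reshape(x.shape[mode], -1)
    return np.linalg.svd(mat, compute_uv=False).sum()

def objective(rain, video, weights):
    """Sum the regularization terms described in the abstract (illustrative weights)."""
    a1, a2, a3, a4, a5 = weights
    return (a1 * np.abs(rain).sum()                     # l1 sparsity of the rain layer
            + a2 * unidirectional_tv(rain, axis=0)      # smoothness along the raindrop direction
            + a3 * unidirectional_tv(video, axis=1)     # rain-perpendicular smoothness of clean video
            + a4 * tensor_nuclear_norm(video, mode=2)   # global correlation along time
            + a5 * unidirectional_tv(video, axis=2))    # local correlation along time
```

In the paper this objective is minimized jointly over the rain and clean-video tensors (subject to their sum matching the observation) via ADMM; the sketch above only evaluates the regularization terms for given tensors.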
ISSN: 1063-6919
DOI: 10.1109/CVPR.2017.301