The Use of Vanishing Point for the Classification of Reflections From Foreground Mask in Videos

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 18, No. 6, pp. 1366–1372
Main Authors: Havasi, L., Szlavik, Z., Sziranyi, T.
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.06.2009

Summary: Foreground extraction is a basic task in surveillance video analysis. In most real cases, its performance depends heavily on the efficiency of shadow detection and on the analysis of lighting conditions and of reflections caused by mirrors or other reflective surfaces. This correspondence focuses on improving foreground extraction in the presence of planar reflective surfaces. We show that the geometric model of a scene with a planar reflective surface reduces to the estimation of a vanishing point, because the fundamental matrix in this case is auto-epipolar (skew-symmetric). The correspondences for the vanishing-point estimation are extracted from motion statistics. Knowledge of the position of the vanishing point allows us to integrate the geometric model and the motion statistics into foreground extraction, separating foreground from reflections and thus achieving better performance. The experiments confirm the accuracy of the estimated vanishing point and the improvement of the foreground mask obtained by removing reflected object parts.
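
As a rough sketch of the geometric step described in the summary (not the authors' implementation): for an auto-epipolar fundamental matrix F = [v]_x, each correspondence (x, x') in homogeneous coordinates satisfies x'^T F x = v . (x cross x') = 0, so the vanishing point v is the least-squares null vector of the stacked cross products x cross x'. A minimal Python/NumPy illustration, assuming the correspondences have already been extracted (function and variable names are hypothetical):

    import numpy as np

    def estimate_vanishing_point(pts, pts_reflected):
        # pts, pts_reflected: (N, 2) arrays of matched pixel coordinates,
        # e.g., object points and their mirrored counterparts (N >= 2).
        ones = np.ones((len(pts), 1))
        x = np.hstack([pts, ones])             # homogeneous coordinates x
        xp = np.hstack([pts_reflected, ones])  # homogeneous coordinates x'
        A = np.cross(x, xp)                    # one linear constraint per match: A @ v = 0
        _, _, Vt = np.linalg.svd(A)            # least-squares null vector of A
        return Vt[-1]                          # vanishing point v (homogeneous)

One plausible use of the estimate, in the spirit of the summary: flag a candidate foreground pixel as a reflection when the line joining it to its matched object pixel passes near the estimated vanishing point; the paper itself combines the geometric model with motion statistics.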
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/TIP.2009.2017137