Temporally Consistent Disparity and Optical Flow via Efficient Spatio-temporal Filtering

Bibliographic Details
Published in: Advances in Image and Video Technology, Vol. 7087, pp. 165-177
Main Authors: Hosni, Asmaa; Rhemann, Christoph; Bleyer, Michael; Gelautz, Margrit
Format: Book Chapter
Language: English
Published: Springer Berlin Heidelberg, Germany, 2011
Series: Lecture Notes in Computer Science
ISBN: 9783642253669; 3642253660
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-642-25367-6_15

Summary: This paper presents a new efficient algorithm for computing temporally consistent disparity maps from video footage. Our method is motivated by recent work [1] that achieves high quality stereo results by smoothing disparity costs with a fast edge-preserving filter. This previous approach was designed to work with single static image pairs and does not maintain temporal coherency of disparity maps when applied to video streams. The main contribution of our work is to transfer this concept to the spatio-temporal domain in order to efficiently achieve temporally consistent disparity maps, where disparity changes are aligned with spatio-temporal edges of the video sequence. We further show that our method can be used as spatio-temporal regularizer for optical flow estimation. Our approach can be implemented efficiently, achieving real-time results for stereo matching. Quantitative and qualitative results demonstrate that our approach (i) considerably improves over frame-by-frame methods for both stereo and optical flow; and (ii) outperforms the state-of-the-art for local space-time stereo approaches.
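
To make the aggregation idea in the summary concrete, the following is a minimal Python sketch of spatio-temporal cost-volume filtering followed by winner-takes-all disparity selection. It is not the authors' implementation: the paper filters matching costs with a fast edge-preserving (guided) filter so that disparity changes align with spatio-temporal edges, whereas this sketch substitutes a plain box filter (scipy's uniform_filter) purely to show how costs are smoothed jointly over time and space before the minimum is taken. The function name and its parameters are hypothetical.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def spatiotemporal_wta_stereo(left, right, max_disp, radius=4, t_window=3):
        """Hypothetical sketch: box-filtered spatio-temporal cost aggregation.

        left, right: grayscale videos as float arrays of shape (T, H, W).
        Returns integer disparity maps of shape (T, H, W). The paper uses an
        edge-preserving filter here; the box filter only demonstrates the
        aggregation structure.
        """
        T, H, W = left.shape
        cost = np.empty((max_disp + 1, T, H, W), dtype=np.float32)
        for d in range(max_disp + 1):
            # Shift the right view by d pixels and take absolute differences
            # as a per-pixel matching cost (border columns are replicated).
            shifted = np.empty_like(right)
            shifted[:, :, d:] = right[:, :, : W - d]
            shifted[:, :, :d] = right[:, :, :1]
            raw = np.abs(left - shifted)
            # Smooth each cost slice over a (time, y, x) window: the
            # spatio-temporal filtering step that enforces temporal coherence.
            cost[d] = uniform_filter(
                raw, size=(t_window, 2 * radius + 1, 2 * radius + 1)
            )
        # Winner-takes-all: pick the disparity with minimum aggregated cost.
        return np.argmin(cost, axis=0)

On a short grayscale clip, spatiotemporal_wta_stereo(left, right, max_disp=32) should exhibit less frame-to-frame flicker than the same pipeline run with t_window=1, i.e. frame-by-frame aggregation, which mirrors the comparison the abstract describes.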