A Simple Stereo Algorithm to Recover Precise Object Boundaries and Smooth Surfaces


Bibliographic Details
Published in: International Journal of Computer Vision, Vol. 47, No. 1-3, pp. 261-273
Main Authors: Okutomi, Masatoshi; Katayama, Yasuhiro; Oka, Setsuko
Format: Journal Article
Language: English
Published: New York: Springer Nature B.V., 01.04.2002

Summary: Recovering precise object boundaries in area-based stereo matching faces two problems. The first is the so-called "occlusion problem", which can be avoided by selecting only the "visible" cameras from among the many cameras used. The second is "boundary overreach": the recovered object boundary ends up wrongly located away from the real one because the matching window extends across a boundary. This is especially harmful when segmenting objects using depth information. A few approaches have been proposed to solve this problem, but they tend to degrade on smooth surfaces; there appears to be a trade-off between recovering precise object edges and obtaining smooth surfaces. In this paper, we propose a new, simple method that solves both problems. Using multiple stereo pairs and multiple windowing, our method detects the region where boundary overreach is likely to occur (the "BO region") and applies appropriate methods to the BO and non-BO regions. Although the proposed method is quite simple, experimental results show that it is very effective at recovering both sharp object edges at their correct locations and smooth object surfaces. We also present a sound analysis of boundary overreach, which has not been clearly explained in the past.
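The abstract's "multiple windowing" idea can be illustrated with a generic shiftable-window block matcher. This is a minimal sketch of the underlying principle only, not the authors' algorithm (which also uses multiple stereo pairs and explicit BO-region detection); all function names and parameters here are my own. For each pixel, the matching window is allowed to shift so the pixel can sit anywhere inside it, which lets the window avoid straddling a depth discontinuity and so reduces boundary overreach compared to a fixed, centered window.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences between two equally sized patches."""
    d = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sum(d * d))

def multi_window_disparity(left, right, max_disp, r=1):
    """Block matching with shiftable (multiple) windows.

    For each pixel, the (2r+1) x (2r+1) window may shift by up to r
    pixels in each direction; the shift with the lowest SSD is used
    for each disparity hypothesis.  A window that can slide off a
    depth boundary scores better than one forced to straddle it,
    which mitigates boundary overreach.
    """
    h, w = left.shape
    disp = np.zeros((h, w), dtype=int)
    shifts = [(dy, dx) for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
    for y in range(r, h - r):
        for x in range(r, w - r):
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - r) + 1):
                # cost of the best window shift at this disparity hypothesis
                cost = min(
                    ssd(left[y + dy - r:y + dy + r + 1,
                             x + dx - r:x + dx + r + 1],
                        right[y + dy - r:y + dy + r + 1,
                              x + dx - r - d:x + dx + r + 1 - d])
                    for dy, dx in shifts
                    if r <= y + dy < h - r and r + d <= x + dx < w - r
                )
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On a synthetic pair where the left image is the right image shifted two pixels, the interior of the recovered disparity map is a constant 2; the BO-region handling described in the paper goes beyond this sketch by choosing different strategies inside and outside the detected BO region.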
ISSN: 0920-5691, 1573-1405
DOI: 10.1023/A:1014510328154