A nonparametric learning approach to range sensing from omnidirectional vision

Bibliographic Details
Published in: Robotics and Autonomous Systems, Vol. 58, No. 6, pp. 762-772
Main Authors: Plagemann, Christian; Stachniss, Cyrill; Hess, Jürgen; Endres, Felix; Franklin, Nathan
Format: Journal Article
Language: English
Published: Elsevier B.V., 30.06.2010
Summary: We present a novel approach to estimating depth from single omnidirectional camera images by learning the relationship between visual features and range measurements available during a training phase. Our model not only yields the most likely distance to obstacles in all directions, but also the predictive uncertainties for these estimates. This information can be utilized by a mobile robot to build an occupancy grid map of the environment or to avoid obstacles during exploration, tasks that typically require dedicated proximity sensors such as laser range finders or sonars. We show in this paper how an omnidirectional camera can be used as an alternative to such range sensors. As the learning engine, we apply Gaussian processes, a nonparametric approach to function regression, as well as a recently developed extension for dealing with input-dependent noise. In practical experiments carried out in different indoor environments with a mobile robot equipped with an omnidirectional camera system, we demonstrate that our system is able to estimate range with an accuracy comparable to that of dedicated sensors based on sonar or infrared light.
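The summary describes Gaussian process regression from visual features to range, including predictive uncertainties. Below is a minimal illustrative sketch of that idea in Python, using scikit-learn's standard GP regressor on synthetic data; the paper's visual feature extraction and its heteroscedastic (input-dependent noise) extension are not reproduced, and all data and variable names here are hypothetical.

# Illustrative sketch only: standard GP regression from image features to range.
# Synthetic stand-in data; not the authors' implementation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

rng = np.random.default_rng(0)

# Hypothetical training set: each row is a feature vector extracted from one
# bearing of the omnidirectional image; targets are laser range readings (m).
X_train = rng.uniform(0.0, 1.0, size=(200, 5))
y_train = 2.0 + 3.0 * X_train[:, 0] + 0.1 * rng.standard_normal(200)

# Squared-exponential kernel plus a constant-variance noise term (the paper
# instead models input-dependent noise, which this sketch omits).
kernel = ConstantKernel(1.0) * RBF(length_scale=0.5) + WhiteKernel(noise_level=0.05)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# Predict the most likely range and its uncertainty for new feature vectors.
X_test = rng.uniform(0.0, 1.0, size=(10, 5))
mean_range, std_range = gp.predict(X_test, return_std=True)
for mu, sigma in zip(mean_range, std_range):
    print(f"predicted range: {mu:.2f} m (+/- {sigma:.2f} m)")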
ISSN: 0921-8890, 1872-793X
DOI: 10.1016/j.robot.2010.02.008