Visual loop closing using multi-resolution SIFT grids in metric-topological SLAM


Bibliographic Details
Published in: 2009 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1438 - 1445
Main Authors: Pradeep, Vivek; Medioni, Gerard; Weiland, James
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2009

Summary: We present an image-based simultaneous localization and mapping (SLAM) framework with online, appearance-only loop closing. We adopt a layered approach, with metric maps over small areas at the local level and a global, graph-based abstract topological framework to build consistent maps over large distances. Rao-Blackwellised particle filtering and sparse bundle adjustment are efficiently coupled with a stereo-vision-based odometry module to construct conditionally independent `submaps' using SIFT features. By extracting keyframes from these submaps, a multi-resolution dictionary of distinct features is built online to learn a generative model of appearance and perform loop closure. Creating such a dictionary also enables the system to distinguish between similar regions during loop closure without requiring any offline training, as has been described in other approaches. Furthermore, instead of occupancy or grid maps, we build 3D reconstructions of the world; a model we plan to use as input to a scene-interpretation module for providing navigational cues to the visually impaired. We demonstrate the robustness of our SLAM system with indoor and outdoor experiments for full 6-degrees-of-freedom motion using only a hand-held stereo camera, running at 1 Hz on a standard PC.
ISBN: 1424439922; 9781424439928
ISSN: 1063-6919
DOI: 10.1109/CVPR.2009.5206769