Vision-based localization using an edge map extracted from 3D laser range data

Bibliographic Details
Published in: 2010 IEEE International Conference on Robotics and Automation, pp. 4902 - 4909
Main Authors: Borges, Paulo; Zlot, Robert; Bosse, Michael; Nuske, Stephen; Tews, Ashley
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2010
ISBN: 9781424450381, 1424450381
ISSN: 1050-4729
DOI: 10.1109/ROBOT.2010.5509517

Summary: Reliable real-time localization is a key component of autonomous industrial vehicle systems. We consider the problem of using on-board vision to determine a vehicle's pose in a known, but non-static, environment. While feasible technologies exist for vehicle localization, many are not suited for industrial settings where the vehicle must operate dependably both indoors and outdoors and in a range of lighting conditions. We extend the capabilities of an existing vision-based localization system, in a continued effort to improve the robustness, reliability and utility of an automated industrial vehicle system. The vehicle pose is estimated by comparing an edge-filtered version of a video stream to an available 3D edge map of the site. We enhance the previous system by additionally filtering the camera input for straight lines using a Hough transform, observing that the 3D environment map contains only linear features. In addition, we present an automated approach for generating 3D edge maps from laser point clouds, removing the need for manual map surveying and also reducing the time for map generation down from days to minutes. We present extensive localization results in multiple lighting conditions comparing the system with and without the proposed enhancements.
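
The abstract describes filtering the camera input for straight lines with a Hough transform before matching against the 3D edge map. The sketch below is only an illustration of that kind of preprocessing step, not the authors' implementation: it edge-filters a frame with a Canny detector and keeps only pixels lying on detected straight-line segments. The OpenCV functions are standard, but all thresholds and parameter values here are illustrative assumptions.

```python
# Illustrative sketch (not the paper's code): edge-filter a camera frame and
# retain only straight-line segments, matching the assumption that the 3D
# environment map contains only linear features.
import cv2
import numpy as np

def straight_line_edge_image(frame_bgr):
    """Return a binary image containing only straight-line edge segments."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # edge-filtered version of the frame (thresholds assumed)

    # Probabilistic Hough transform: keep only edges that form line segments.
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=50, minLineLength=30, maxLineGap=5)

    line_mask = np.zeros_like(edges)
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            cv2.line(line_mask, (x1, y1), (x2, y2), 255, 1)
    return line_mask
```

The resulting binary line image could then be compared against a projection of the 3D edge map for pose estimation, as outlined in the abstract.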