Improving path planning and mapping based on stereo vision and lidar

Bibliographic Details
Published in: 2008 10th International Conference on Control, Automation, Robotics and Vision, pp. 384 - 389
Main Authors: Moghadam, P., Wijesoma, W.S., Dong Jun Feng
Format: Conference Proceeding
Language: English
Published: IEEE, 01.12.2008

Summary: 2D laser range finders have been widely used in mobile robot navigation. However, their use is limited to simple environments containing objects of regular geometry and shape. Stereo vision, by contrast, provides 3D structural data about complex objects. In this paper, measurements from a stereo vision camera system and a 2D laser range finder are fused to dynamically plan paths for and navigate a mobile robot in cluttered, complex environments. A robust estimator based on disparity information from the stereo vision system detects obstacles and the ground plane in a 3D world model of the scene in front of the robot. From this 3D world model a 2D cost map is generated; a separate 2D cost map is also generated from the 2D laser range finder. A grid-based occupancy map approach is then used to fuse the complementary information provided by the two sensors. Since the two sensors may detect different parts of an object, two different fusion strategies are addressed. The final occupancy grid map is used simultaneously for obstacle avoidance and path planning. Experimental results obtained from a Point Grey Bumblebee stereo camera and a SICK LD-OEM laser range finder mounted on a PackBot robot demonstrate the effectiveness of the proposed lidar and stereo vision fusion strategy for mobile robot navigation.
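The occupancy-grid fusion described in the summary can be illustrated with a minimal sketch. This is not the authors' code; it assumes each sensor has already produced a 2D grid of occupancy probabilities in [0, 1], and shows two generic fusion strategies of the kind the abstract alludes to: a conservative element-wise maximum (a cell is occupied if either sensor reports it) and independent-sensor Bayesian fusion in log-odds space. The function names and example grids are hypothetical.

```python
import numpy as np

def fuse_max(laser_grid, stereo_grid):
    """Conservative fusion: keep the more pessimistic occupancy
    estimate per cell (union of obstacles seen by either sensor)."""
    return np.maximum(laser_grid, stereo_grid)

def fuse_log_odds(laser_grid, stereo_grid, eps=1e-6):
    """Bayesian fusion assuming independent sensors: convert each
    probability to log-odds l = log(p / (1 - p)), sum the evidence,
    and convert back to a probability."""
    def log_odds(p):
        p = np.clip(p, eps, 1.0 - eps)  # avoid division by zero / log(0)
        return np.log(p / (1.0 - p))
    l = log_odds(laser_grid) + log_odds(stereo_grid)
    return 1.0 / (1.0 + np.exp(-l))    # inverse of the log-odds transform

# Tiny 2x2 example: 0.5 means "no information" from that sensor.
laser = np.array([[0.5, 0.9],
                  [0.1, 0.5]])
stereo = np.array([[0.5, 0.5],
                   [0.8, 0.2]])
print(fuse_max(laser, stereo))
print(fuse_log_odds(laser, stereo))
```

Note the difference in character: `fuse_max` never lets one sensor's free-space reading cancel the other's obstacle detection, which suits sensors that see different parts of an object, while log-odds fusion lets conflicting evidence (0.1 vs. 0.8 above) partially cancel.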
ISBN: 9781424422869, 1424422868
DOI: 10.1109/ICARCV.2008.4795550