An Engineering Solution for Multi-sensor Fusion SLAM in Indoor and Outdoor Scenes

Bibliographic Details
Published in: 2024 43rd Chinese Control Conference (CCC), pp. 4522 - 4528
Main Authors: Jiang, Fengyang; Cheng, Yao; Wang, Huaizhen; Han, Zhe; Huang, Yang; Zhou, Fengyu; Jiang, Jiaju
Format: Conference Proceeding
Language: English
Published: Technical Committee on Control Theory, Chinese Association of Automation, 28.07.2024

Summary: In this contribution, an engineering solution for multi-sensor fusion simultaneous localization and mapping (SLAM) is proposed for both indoor and outdoor scenarios, targeting enhanced robustness, accuracy, and scene adaptability. It consists of three powerful schemes. A scheme-switching mechanism, designed on the basis of a thorough performance evaluation, flexibly selects the most suitable multi-sensor fusion SLAM method according to the scene characteristics and the configuration of the robot product. Among the three schemes, LVI-SAM-Stereo is a novel multi-sensor fusion SLAM approach that tightly couples a stereo camera with a 3D light detection and ranging (LiDAR) sensor and an inertial measurement unit (IMU). Its stereo-inertial odometry provides a more robust initial guess for LiDAR registration than its monocular counterpart in critical mapping scenarios. Moreover, a visual verification mechanism for LiDAR loop closure detection is proposed to effectively avoid incorrect LiDAR loop closures. A thorough evaluation with both datasets and real-world experiments in various indoor and outdoor scenarios verifies that our proposed engineering solution achieves satisfactory performance and meets the engineering requirements for autonomous navigation of robot products.
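
Note on the visual verification of LiDAR loop closures mentioned in the summary: the record does not spell out how the check is implemented, so the sketch below is only a minimal illustrative guess, assuming the camera keyframes attached to the two loop-closure poses are compared with ORB feature matching plus a RANSAC geometric consistency check. The function name, feature type, and thresholds are assumptions, not details taken from the paper.

import cv2
import numpy as np

def verify_loop_closure(img_query, img_candidate, min_matches=30, ratio=0.7):
    """Accept a LiDAR loop-closure candidate only if the associated camera
    keyframes share enough geometrically consistent visual features."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img_query, None)
    kp2, des2 = orb.detectAndCompute(img_candidate, None)
    if des1 is None or des2 is None:
        return False  # texture-less frames cannot confirm the closure

    # k-NN matching with Lowe's ratio test to discard ambiguous correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m[0] for m in knn
            if len(m) == 2 and m[0].distance < ratio * m[1].distance]
    if len(good) < min_matches:
        return False

    # Require a geometrically consistent subset via fundamental-matrix RANSAC.
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    _, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    return mask is not None and int(mask.sum()) >= min_matches

A candidate loop closure proposed by the LiDAR back end would be accepted only when this check passes, which is one plausible way to realize the "avoid incorrect LiDAR loop closures" behavior described above.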
ISSN: 1934-1768
DOI: 10.23919/CCC63176.2024.10661451