Tightly Coupled Integration of GNSS and Vision SLAM Using 10-DoF Optimization on Manifold

Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 19, No. 24, pp. 12105–12117
Main Authors: Gong, Zheng; Ying, Rendong; Wen, Fei; Qian, Jiuchao; Liu, Peilin
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 15.12.2019
Summary: Vision-based navigation, especially vision-based simultaneous localization and mapping (V-SLAM), plays a critical role in robotic navigation. As a relative positioning technique, V-SLAM suffers from drift and scale uncertainty, which cause its bias to grow over time. To overcome these drawbacks and improve the robustness and accuracy of localization, an effective approach is to fuse global navigation satellite system (GNSS) measurements with V-SLAM. This paper proposes a novel GNSS-SLAM fusion algorithm that estimates ego-motion by tightly coupling GNSS pseudo-range measurements with camera feature points. The algorithm first decomposes the pose state into basic motion vectors, on which asynchronous tracking is performed. A 10-DoF joint-optimization formulation on manifold is then proposed to tightly fuse the raw measurements from the camera and GNSS, and this formulation is solved to obtain the ego-motion state. The proposed algorithm is verified on an autonomous ground vehicle in two typical environments. The results demonstrate that the new algorithm can correct the bias in vision SLAM and constrain the GNSS solution, achieving better localization than traditional methods.
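The summary sketches the pipeline only at a high level. As a rough illustration of what tightly coupling raw GNSS pseudo-ranges with camera feature points in one joint least-squares problem can look like, below is a minimal Python sketch assuming one plausible 10-DoF state layout (position, rotation vector, velocity, receiver clock bias). All names, the toy data, and the constant-velocity prior are hypothetical assumptions, not the paper's actual formulation, which additionally optimizes on manifold and handles asynchronous camera/GNSS timing.

```python
# Hypothetical sketch only: the state layout, helper names, and toy data are
# assumptions, not the paper's formulation. Assumed 10-DoF state:
# position (3) + rotation vector (3) + velocity (3) + receiver clock bias (1).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

C = 299_792_458.0  # speed of light [m/s]

def unpack(x):
    """Split the assumed 10-DoF state vector."""
    return x[:3], Rotation.from_rotvec(x[3:6]), x[6:9], x[9]

def residuals(x, sat_pos, pseudo_ranges, landmarks, pixels, K, p_prev, dt):
    """Stack raw GNSS pseudo-range residuals, vision reprojection residuals,
    and a constant-velocity prior; covariance weighting omitted for brevity."""
    p, R, v, clk = unpack(x)
    # GNSS: predicted pseudo-range = geometric range + clock-bias term.
    r_gnss = np.linalg.norm(sat_pos - p, axis=1) + C * clk - pseudo_ranges
    # Vision: transform world landmarks into the camera frame and project.
    pts_cam = R.inv().apply(landmarks - p)
    uv = (K @ (pts_cam / pts_cam[:, 2:3]).T).T[:, :2]
    r_vis = (uv - pixels).ravel()
    # Motion prior tying velocity to the displacement since the last epoch.
    r_mot = (p - p_prev) / dt - v
    return np.concatenate([r_gnss, r_vis, r_mot])

# Toy usage: noise-free measurements generated from a known ground truth.
rng = np.random.default_rng(0)
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
sat_pos = rng.normal(0.0, 2.0e7, (5, 3))           # satellite positions [m]
landmarks = rng.normal([0.0, 0.0, 10.0], 2.0, (8, 3))
x_true = np.array([1.0, 2.0, 0.5,                  # position
                   0.01, 0.02, 0.0,                # rotation vector
                   10.0, 20.0, 5.0,                # velocity (dt = 0.1 s)
                   1e-6])                          # clock bias [s]
p, R, _, clk = unpack(x_true)
pseudo_ranges = np.linalg.norm(sat_pos - p, axis=1) + C * clk
pc = R.inv().apply(landmarks - p)
pixels = (K @ (pc / pc[:, 2:3]).T).T[:, :2]

sol = least_squares(residuals, np.zeros(10), x_scale='jac',
                    args=(sat_pos, pseudo_ranges, landmarks, pixels, K,
                          np.zeros(3), 0.1))
print(np.round(sol.x, 4))  # should be close to x_true on noise-free data
```

A full implementation would weight each residual block by its measurement covariance and parameterize the rotation with local on-manifold perturbations rather than a single global rotation vector.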
ISSN: 1530-437X
EISSN: 1558-1748
DOI: 10.1109/JSEN.2019.2935387