Fusion of IMU and Vision for Absolute Scale Estimation in Monocular SLAM

Bibliographic Details
Published in: Journal of Intelligent & Robotic Systems, Vol. 61, No. 1-4, pp. 287-299
Main Authors: Nützi, Gabriel; Weiss, Stephan; Scaramuzza, Davide; Siegwart, Roland
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands, 01.01.2011 (Springer Nature B.V.)
Summary: The fusion of inertial and visual data is widely used to improve an object’s pose estimation. However, this type of fusion is rarely used to estimate further unknowns in the visual framework. In this paper we present and compare two different approaches to estimate the unknown scale parameter in a monocular SLAM framework. Directly linked to the scale is the estimation of the object’s absolute velocity and position in 3D. The first approach is a spline-fitting task adapted from Jung and Taylor, and the second is an extended Kalman filter. Both methods have been simulated offline on arbitrary camera paths to analyze their behavior and the quality of the resulting scale estimation. We then embedded an online multi-rate extended Kalman filter, together with an inertial sensor, in the Parallel Tracking and Mapping (PTAM) algorithm of Klein and Murray. In this inertial/monocular SLAM framework, we show a real-time, robust, and fast-converging scale estimation. Our approach depends neither on known patterns in the vision part nor on a complex temporal synchronization between the visual and inertial sensors.
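The record contains no implementation details beyond the abstract, so the following is only a minimal illustrative sketch of the kind of filter the summary describes: an extended Kalman filter that fuses IMU accelerations with un-scaled monocular SLAM positions to recover metric position, velocity, and the visual-map scale. The 1-D state layout [p, v, s], the measurement model z ≈ p / s, the class name ScaleEKF, and all noise values are assumptions chosen here for clarity, not taken from the paper.

```python
# Minimal 1-D sketch of an EKF estimating metric position p, velocity v, and
# the unknown visual-map scale s from IMU accelerations (high rate) and
# un-scaled monocular SLAM position measurements (low rate).
# Illustrative only; state layout, measurement model, and noise values are assumptions.
import numpy as np

class ScaleEKF:
    def __init__(self, p0=0.0, v0=0.0, s0=1.0):
        self.x = np.array([p0, v0, s0], dtype=float)   # state: [p, v, s]
        self.P = np.diag([1.0, 1.0, 10.0])             # large initial scale uncertainty
        self.Q = np.diag([1e-4, 1e-2, 1e-6])           # process noise (tuning assumption)
        self.R = np.array([[1e-3]])                    # vision measurement noise

    def predict(self, acc, dt):
        """IMU-rate prediction with measured acceleration `acc` (gravity removed)."""
        p, v, s = self.x
        self.x = np.array([p + v * dt + 0.5 * acc * dt**2, v + acc * dt, s])
        F = np.array([[1.0, dt, 0.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z_vision):
        """Vision-rate update: SLAM reports position in map units, z ≈ p / s."""
        p, v, s = self.x
        h = np.array([p / s])                          # predicted measurement
        H = np.array([[1.0 / s, 0.0, -p / s**2]])      # Jacobian of h w.r.t. [p, v, s]
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ (np.array([z_vision]) - h)).ravel()
        self.P = (np.eye(3) - K @ H) @ self.P
```

In such a formulation the scale becomes observable only when the camera undergoes non-zero acceleration; under constant-velocity motion the accelerometer carries no scale information, which is why sufficient motion excitation matters in inertial/monocular fusion.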
ISSN: 0921-0296; 1573-0409
DOI: 10.1007/s10846-010-9490-z