Self-calibrating multi-camera visual-inertial fusion for autonomous MAVs

Bibliographic Details
Published in: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4984-4991
Main Authors: Zhenfei Yang, Tianbo Liu, Shaojie Shen
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2016

Summary: We address the important problem of achieving robust and easy-to-deploy visual state estimation for micro aerial vehicles (MAVs) operating in complex environments. We use a sensor suite consisting of multiple cameras and an IMU to maximize perceptual awareness of the surroundings and provide sufficient redundancy against sensor failures. Our approach starts with an online initialization procedure that simultaneously estimates the transformation between each camera and the IMU, as well as the initial velocity and attitude of the platform, without any prior knowledge about the mechanical configuration of the sensor suite. Based on the initial calibrations, a tightly-coupled, optimization-based, generalized multi-camera-inertial fusion method runs onboard the MAV with online camera-IMU calibration refinement and identification of sensor failures. Our approach dynamically configures the system into monocular, stereo, or other multi-camera visual-inertial settings, with their respective perceptual advantages, based on the availability of visual measurements. We show that even under random camera failures, our method can be used for feedback control of the MAV. We highlight our approach in challenging indoor-outdoor navigation tasks with large variations in vehicle height and speed, scene depth, and illumination.
ISSN: 2153-0866
DOI: 10.1109/IROS.2016.7759732
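
A tightly-coupled, optimization-based multi-camera-inertial estimator of the kind summarized above typically minimizes a sum of a prior term, an IMU term, and one visual term per camera. The sketch below uses standard visual-inertial notation and is not necessarily the paper's exact formulation; the residual definitions, robust weighting, and marginalization prior may differ:

    \min_{\mathcal{X}} \; \| \mathbf{r}_p \|^2
      + \sum_{k \in \mathcal{B}} \big\| \mathbf{r}_{\mathcal{B}} \big( \hat{\mathbf{z}}^{b_k}_{b_{k+1}}, \mathcal{X} \big) \big\|^2_{\mathbf{P}^{b_k}_{b_{k+1}}}
      + \sum_{c=1}^{N} \sum_{(l,j) \in \mathcal{F}_c} \big\| \mathbf{r}_{\mathcal{C}} \big( \hat{\mathbf{z}}^{c}_{l,j}, \mathcal{X} \big) \big\|^2_{\mathbf{\Sigma}_c}

Here \mathcal{X} stacks IMU poses, velocities, and biases over a sliding window, feature depths, and one camera-IMU extrinsic transform per camera; \mathbf{r}_p is a prior from marginalization, \mathbf{r}_{\mathcal{B}} the preintegrated IMU residual between consecutive states, and \mathbf{r}_{\mathcal{C}} the reprojection residual of feature l observed in frame j of camera c out of N. Keeping the extrinsics inside \mathcal{X} is what permits online calibration refinement within the same optimization, and dropping the visual terms of a failed camera is what reconfigures the estimator between monocular, stereo, and general multi-camera operation.

A minimal sketch of that failure-driven reconfiguration, assuming a hypothetical per-camera health test based on surviving feature tracks (all names and the threshold are illustrative, not taken from the paper):

    from dataclasses import dataclass

    # Hypothetical threshold: minimum feature tracks for a camera to count
    # as usable in the current frame.
    MIN_TRACKED_FEATURES = 20

    @dataclass
    class CameraStatus:
        name: str
        tracked_features: int  # tracks surviving into the current frame

        @property
        def healthy(self) -> bool:
            return self.tracked_features >= MIN_TRACKED_FEATURES

    def select_configuration(cameras: list[CameraStatus]) -> str:
        """Pick a fusion mode from the currently healthy cameras.

        Falls back to inertial-only propagation when every camera has
        failed, so state estimation (and feedback control) can continue
        until vision recovers.
        """
        healthy = [c for c in cameras if c.healthy]
        if not healthy:
            return "inertial-only"
        if len(healthy) == 1:
            return "monocular-inertial"
        if len(healthy) == 2:
            return "stereo-inertial"
        return "multi-camera-inertial"

    # Example: one camera blinded, e.g. by direct sunlight.
    cams = [CameraStatus("front_left", 85),
            CameraStatus("front_right", 3),   # failed this frame
            CameraStatus("down", 42)]
    print(select_configuration(cams))  # -> stereo-inertial

In an actual estimator the selected mode would gate which residual blocks of the cost above enter the optimization, rather than returning a label.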