A novel system for object pose estimation using fused vision and inertial data

Bibliographic Details
Published in: Information Fusion, Vol. 33, pp. 15–28
Main Authors: Li, Juan; Besada, Juan A.; Bernardos, Ana M.; Tarrío, Paula; Casar, José R.
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.01.2017
Summary:
•We present a novel system for object pose estimation that fuses vision and inertial data.
•Different algorithms for fusing data from inertial sensors with monocular or stereo vision data are described and compared.
•The error propagation properties of the system are analyzed.
•The performance of the proposed system is assessed with simulation and experimental data.
•The system achieves accurate, fast and low-cost 6-DoF pose estimation.

Six-degree-of-freedom (6-DoF) pose estimation is of fundamental importance to many applications, such as robotics, indoor tracking and Augmented Reality. Although a number of pose estimation solutions have been proposed, it remains a critical challenge to provide a low-cost, real-time, accurate and easy-to-deploy solution. Addressing this issue, this paper describes a multisensor system for accurate pose estimation that relies on low-cost technologies, in particular on a combination of webcams, inertial sensors and a printable colored fiducial. With the aid of inertial sensors, the system can estimate full pose with both monocular and stereo vision. The system's error propagation is analyzed and validated through simulations and experimental tests. Our error analysis and experimental data demonstrate that the proposed system has great potential in practical applications, as it achieves high accuracy (on the order of centimeters for position estimation and a few degrees for orientation estimation) using the mentioned low-cost sensors, while satisfying tight real-time requirements.
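The record itself gives no implementation details, but the kind of loosely-coupled vision/inertial fusion the summary describes can be illustrated with a toy sketch. The example below is an assumption on our part, not the paper's method: a simple complementary filter in which a high-rate but drifting gyro-integrated orientation is periodically corrected by a low-rate, drift-free vision measurement (as a fiducial-based pose estimate would provide). All function names and parameter values are illustrative.

```python
import numpy as np

def propagate(rpy, gyro, dt):
    """Dead-reckon orientation (roll/pitch/yaw, rad) from gyro rates.
    Small-angle Euler integration; any gyro bias makes this drift."""
    return rpy + gyro * dt

def fuse(rpy_imu, rpy_vision, alpha=0.5):
    """Complementary blend: the high-rate IMU estimate is pulled
    toward the low-rate but drift-free vision measurement."""
    return alpha * rpy_imu + (1.0 - alpha) * rpy_vision

# Toy scenario: constant 0.1 rad/s yaw rotation, a 100 Hz gyro with a
# 0.02 rad/s yaw bias, and a (noise-free) vision fix every 10 samples.
dt, rate = 0.01, np.array([0.0, 0.0, 0.1])
bias = np.array([0.0, 0.0, 0.02])
rpy = np.zeros(3)
for k in range(1, 101):
    rpy = propagate(rpy, rate + bias, dt)   # IMU-only prediction
    if k % 10 == 0:                         # vision correction
        rpy = fuse(rpy, rate * (k * dt))
# After 1 s, IMU dead reckoning alone would report 0.12 rad of yaw
# (0.02 rad of bias-induced drift); the fused estimate stays near 0.1.
```

A real system would replace the noise-free vision fix with a fiducial pose from the cameras and would typically use a Kalman-style filter rather than a fixed blend, but the structure (fast inertial prediction, slower absolute correction) is the same.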
ISSN: 1566-2535; 1872-6305
DOI: 10.1016/j.inffus.2016.04.006