Three-dimensional reconstruction method based on inertial measurement unit and RGB-D sensor

Bibliographic Details
Main Authors: ZHI RUIRUI, LANG HAO, SUN SHUDONG, ZHONG YAO, HAN QING
Format: Patent
Language: English
Published: 25.03.2015

Summary: The invention discloses a three-dimensional reconstruction method based on an inertial measurement unit (IMU) and an RGB-D sensor. The method addresses the poor real-time performance of existing RGB-D-based three-dimensional reconstruction methods. According to the technical scheme, the SURF algorithm is first used to extract key points, the normal vector of each key point is computed, and three-dimensional Fast Point Feature Histogram (FPFH) descriptors are then calculated. Next, the inertial measurement unit provides a coarse motion estimate of the RGB-D sensor; this coarse estimate and the motion estimate obtained from the three-dimensional point cloud are combined to produce a more accurate motion estimate, which is then used as the initial value of the ICP algorithm for iterative refinement. Finally, when a loop closure is detected, the ELCH algorithm is used to quickly optimize the global map. Because the SURF algorithm is used to locate key points and three-dimensional FPFH features serve as their descriptors, the feature-extraction stage of the three-dimensional reconstruction algorithm is accelerated; the method is highly practical.
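The abstract describes two point-cloud steps that map directly onto standard library calls: computing FPFH descriptors at the extracted key points, and running ICP seeded with the IMU-derived coarse motion estimate. The sketch below, assuming the PCL library, illustrates only these two steps; the search radii, iteration count, and function names are illustrative assumptions, and the SURF key-point extraction from the RGB image and the ELCH loop-closure optimization are not shown.

```cpp
#include <pcl/point_types.h>
#include <pcl/features/normal_3d.h>
#include <pcl/features/fpfh.h>
#include <pcl/registration/icp.h>
#include <Eigen/Core>

using PointT  = pcl::PointXYZ;
using NormalT = pcl::Normal;
using Cloud   = pcl::PointCloud<PointT>;

// Compute FPFH descriptors for a set of key points, using the full cloud
// as the search surface (as the abstract's key-point pipeline implies).
pcl::PointCloud<pcl::FPFHSignature33>::Ptr
computeFPFH(const Cloud::Ptr& cloud, const Cloud::Ptr& keypoints)
{
    // Surface normals are required before FPFH descriptors can be computed.
    pcl::NormalEstimation<PointT, NormalT> ne;
    ne.setInputCloud(cloud);
    ne.setRadiusSearch(0.03);                       // 3 cm neighbourhood (assumed)
    pcl::PointCloud<NormalT>::Ptr normals(new pcl::PointCloud<NormalT>);
    ne.compute(*normals);

    pcl::FPFHEstimation<PointT, NormalT, pcl::FPFHSignature33> fpfh;
    fpfh.setInputCloud(keypoints);                  // descriptors only at key points
    fpfh.setSearchSurface(cloud);                   // neighbourhoods from the full cloud
    fpfh.setInputNormals(normals);
    fpfh.setRadiusSearch(0.05);                     // must exceed the normal radius
    pcl::PointCloud<pcl::FPFHSignature33>::Ptr descriptors(
        new pcl::PointCloud<pcl::FPFHSignature33>);
    fpfh.compute(*descriptors);
    return descriptors;
}

// Refine an IMU-derived coarse pose with ICP; the coarse pose is passed as
// the initial guess so the iteration starts close to the true alignment.
Eigen::Matrix4f refineWithICP(const Cloud::Ptr& source,
                              const Cloud::Ptr& target,
                              const Eigen::Matrix4f& imuGuess)
{
    pcl::IterativeClosestPoint<PointT, PointT> icp;
    icp.setInputSource(source);
    icp.setInputTarget(target);
    icp.setMaximumIterations(30);                   // assumed budget
    Cloud aligned;
    icp.align(aligned, imuGuess);                   // initial value from the IMU fusion
    return icp.getFinalTransformation();
}
```

Passing the fused coarse estimate to `icp.align` as the starting transform is what the abstract relies on for real-time performance: ICP converges in far fewer iterations when initialized near the true pose. For the loop-closure step, PCL also ships an ELCH implementation (`pcl/registration/elch.h`) that could presumably fill the role described in the abstract, though its use is not shown here.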
Bibliography: Application Number: CN20141631074