RGB-D fusion: Real-time robust tracking and dense mapping with RGB-D data fusion

Bibliographic Details
Published in: 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2749 - 2754
Main Authors: Seong-Oh Lee, Hwasup Lim, Hyoung-Gon Kim, Sang Chul Ahn
Format: Conference Proceeding
Language: English
Published: IEEE, 01.09.2014

Summary: We present RGB-D Fusion, a framework that robustly tracks and reconstructs dense textured surfaces of scenes and objects by integrating both color and depth images streamed from an RGB-D sensor into a global colored volume in real time. To handle failures of the ICP-based tracking used in KinectFusion when scenes lack sufficient geometric information, we propose a novel approach that registers the input RGB-D image with the colored volume through photometric tracking and geometric alignment. We demonstrate the strengths of the proposed approach compared with the ICP-based approach and show the superior performance of our algorithm on real-world data.
ISSN: 2153-0858, 2153-0866
DOI: 10.1109/IROS.2014.6942938
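
The summary describes registering each incoming RGB-D frame against the colored volume through joint photometric and geometric alignment. The paper's exact formulation is not reproduced here, so the following is a minimal illustrative sketch only: it assumes a stacked point-to-plane (geometric) plus intensity-difference (photometric) residual minimized over a small SE(3) twist. The function names, the weight w_photo, and the synthetic correspondences are assumptions for illustration, not the authors' implementation.

# Illustrative sketch only: joint photometric + geometric (point-to-plane)
# frame-to-model alignment of the kind the abstract describes. Names, the
# weighting, and the toy data are assumptions, not the paper's method.
import numpy as np
from scipy.optimize import least_squares

def se3_exp(xi):
    """Map a 6-vector twist (rx, ry, rz, tx, ty, tz) to a 4x4 pose (small-angle form)."""
    rx, ry, rz, tx, ty, tz = xi
    R = np.eye(3) + np.array([[0, -rz, ry],
                              [rz, 0, -rx],
                              [-ry, rx, 0]])   # first-order rotation approximation
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, (tx, ty, tz)
    return T

def residuals(xi, src_pts, src_colors, dst_pts, dst_normals, dst_colors, w_photo=0.1):
    """Stacked geometric (point-to-plane) and photometric residuals.

    src_* : points/intensities from the incoming RGB-D frame (N x 3, N)
    dst_* : corresponding points/normals/intensities raycast from the colored volume
    """
    T = se3_exp(xi)
    p = src_pts @ T[:3, :3].T + T[:3, 3]                 # transform source points
    r_geo = np.sum((p - dst_pts) * dst_normals, axis=1)  # point-to-plane distances
    r_photo = w_photo * (src_colors - dst_colors)        # intensity differences
    return np.concatenate([r_geo, r_photo])

# Toy usage with synthetic correspondences standing in for real associations.
rng = np.random.default_rng(0)
dst_pts = rng.uniform(-1, 1, (200, 3))
dst_normals = rng.normal(size=(200, 3))
dst_normals /= np.linalg.norm(dst_normals, axis=1, keepdims=True)
dst_colors = rng.uniform(0, 1, 200)
true_T = se3_exp([0.01, -0.02, 0.015, 0.03, -0.01, 0.02])
src_pts = (dst_pts - true_T[:3, 3]) @ true_T[:3, :3]     # inverse-transform to fake a new frame
src_colors = dst_colors.copy()                           # perfectly matching intensities

fit = least_squares(residuals, np.zeros(6),
                    args=(src_pts, src_colors, dst_pts, dst_normals, dst_colors))
print("estimated twist:", np.round(fit.x, 4))            # should approximate the true twist

In a full pipeline the correspondences would come from projective data association against a raycast of the colored volume, and the photometric term is what keeps tracking constrained when the geometry alone (pure ICP, as in KinectFusion) is ambiguous.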