RGB-D fusion: Real-time robust tracking and dense mapping with RGB-D data fusion
Published in | 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 2749-2754
Main Authors | , , ,
Format | Conference Proceeding
Language | English
Published | IEEE, 01.09.2014
Summary | We present RGB-D Fusion, a framework that robustly tracks and reconstructs dense textured surfaces of scenes and objects by integrating both color and depth images streamed from an RGB-D sensor into a global colored volume in real time. To handle failures of the ICP-based tracking approach of KinectFusion, caused by a lack of sufficient geometric information, we propose a novel approach that registers the input RGB-D image with the colored volume by photometric tracking and geometric alignment. We demonstrate the strengths of the proposed approach compared with the ICP-based approach and show the superior performance of our algorithm on real-world data.
ISSN | 2153-0858, 2153-0866
DOI | 10.1109/IROS.2014.6942938
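The abstract's core idea, registering each RGB-D frame against the colored volume with both a geometric (ICP-style, point-to-plane) term and a photometric (color) term, can be sketched roughly as below. The function name, the fixed photometric weight, and the residual form are illustrative assumptions only, not the paper's actual implementation.

```python
import numpy as np

def joint_residual(src_pts, src_colors, dst_pts, dst_normals, dst_colors,
                   weight_photo=0.1):
    """Stacked geometric + photometric residual for one set of point
    correspondences (current frame vs. model/volume). A pose optimizer
    would minimize the squared norm of this vector over camera poses."""
    # Point-to-plane geometric error: signed distance along the model normal.
    geo = np.sum((src_pts - dst_pts) * dst_normals, axis=1)
    # Photometric error: intensity difference of corresponding points,
    # down-weighted relative to the geometric term (assumed weighting).
    photo = weight_photo * (src_colors - dst_colors)
    return np.concatenate([geo, photo])

# Toy check: perfectly aligned correspondences give a zero residual.
pts = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 1.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
colors = np.array([0.5, 0.6])
print(np.allclose(joint_residual(pts, colors, pts, normals, colors), 0.0))  # True
```

The photometric term is what keeps the residual informative on geometrically degenerate scenes (e.g. a flat textured wall), where the point-to-plane term alone leaves the in-plane motion unconstrained and plain ICP drifts.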