Bi-direction Direct RGB-D Visual Odometry


Bibliographic Details
Published in: Applied Artificial Intelligence, Vol. 34, no. 14, pp. 1137-1158
Main Authors: Cai, Jiyuan; Luo, Lingkun; Hu, Shiqiang
Format: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis, 05.12.2020

Summary: Direct visual odometry (DVO) is an important vision task that aims to recover camera motion by minimizing the photometric error across correlated images. However, previous research on DVO rarely considered motion bias and computed the motion in a single direction only, potentially discarding information that is available when diverse directions are leveraged. We assume that jointly considering forward and backward calculation can improve the accuracy of pose estimation. To verify this assumption and solidify the contribution, we test various combinations of direct dense methods, covering different error metrics (intensity, gradient magnitude), alignment strategies (forward-compositional, inverse-compositional), and calculation directions (forward, backward, and bi-directional). We further study the issue of motion bias in RGB-D visual odometry and propose four strategies to improve pose estimation accuracy: joint bi-direction estimation, two-stage bi-direction estimation, transform averaging with weights, and transform fusion with covariance. We demonstrate the effectiveness and efficiency of the proposed algorithms on a range of popular datasets, e.g., TUM RGB-D and ICL-NUIM, where they achieve strong performance compared with state-of-the-art methods and provide benefits for existing RGB-D visual odometry and visual SLAM systems.
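To make the "transform averaging with weights" idea from the abstract concrete, the sketch below fuses a forward pose estimate with the inverse of a backward estimate. This is an illustrative reconstruction, not the paper's actual algorithm: the weighting scheme, the function names, and the choice to average rotations in rotation-vector space (a small-angle approximation that is reasonable for inter-frame motion) are all assumptions.

```python
import numpy as np

def so3_exp(r):
    """Rodrigues' formula: rotation vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(r)
    if theta < 1e-10:
        return np.eye(3)
    k = r / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def so3_log(R):
    """Inverse of so3_exp: 3x3 rotation matrix -> rotation vector."""
    cos_t = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
    theta = np.arccos(cos_t)
    if theta < 1e-10:
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return theta * axis

def invert_se3(T):
    """Closed-form inverse of a 4x4 rigid-body transform."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def fuse_bidirectional(T_fwd, T_bwd, w_fwd=0.5):
    """Weighted average of a forward pose estimate and the inverted
    backward estimate (hypothetical weighting; the paper's scheme may
    differ). Rotations are averaged in rotation-vector space."""
    T_b = invert_se3(T_bwd)  # express the backward estimate as forward motion
    w_b = 1.0 - w_fwd
    T = np.eye(4)
    T[:3, :3] = so3_exp(w_fwd * so3_log(T_fwd[:3, :3])
                        + w_b * so3_log(T_b[:3, :3]))
    T[:3, 3] = w_fwd * T_fwd[:3, 3] + w_b * T_b[:3, 3]
    return T
```

As a sanity check, if the backward run returns exactly the inverse of the forward transform (i.e., the two directions agree perfectly), the fused result reproduces the forward estimate for any weight.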
ISSN: 0883-9514, 1087-6545
DOI: 10.1080/08839514.2020.1824093