From Pixels to Precision: A Survey of Monocular Visual Odometry in Digital Twin Applications

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 24, No. 4, p. 1274
Main Authors: Neyestani, Arman; Picariello, Francesco; Ahmed, Imran; Daponte, Pasquale; De Vito, Luca
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 17.02.2024
Summary: This survey provides a comprehensive overview of traditional techniques and deep learning-based methodologies for monocular visual odometry (VO), with a focus on displacement measurement applications. The paper outlines the fundamental concepts and general procedures for VO implementation, including feature detection, tracking, motion estimation, triangulation, and trajectory estimation. It also explores the research challenges inherent in VO implementation, notably scale estimation and ground plane considerations. The scientific literature offers diverse methodologies aiming to overcome these challenges, with particular focus on the problem of accurate scale estimation. This issue has typically been addressed by relying on knowledge of the camera's height above the ground plane and evaluating the movement of features on that plane; alternatively, some approaches have utilized additional sensors, such as LiDAR or depth sensors. The survey concludes with a discussion of future research challenges and opportunities in the field of monocular visual odometry.
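
As a purely illustrative companion to the pipeline described in the summary, the sketch below shows one common way a frame-to-frame monocular VO step can be implemented with OpenCV: ORB feature detection and matching, essential-matrix estimation with RANSAC, and pose recovery up to an unknown scale. The intrinsic matrix K, the camera height CAMERA_HEIGHT_M, and the helper names are assumptions made for this example and are not taken from the surveyed paper.

import numpy as np
import cv2

# Assumed pinhole intrinsics and camera mounting height; illustrative values only.
K = np.array([[718.8, 0.0, 607.2],
              [0.0, 718.8, 185.2],
              [0.0, 0.0, 1.0]])
CAMERA_HEIGHT_M = 1.65  # assumed known height of the camera above the ground plane

def relative_pose(img_prev, img_curr):
    """Detect and match ORB features, then recover frame-to-frame motion up to scale."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Essential matrix with RANSAC, decomposed into rotation R and unit-norm translation t.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t is known only up to scale in the monocular case

def accumulate(poses, R, t, scale=1.0):
    """Append the next camera pose to the trajectory, given a relative motion and a scale."""
    R_w, t_w = poses[-1]
    return poses + [(R @ R_w, t_w + scale * (R_w @ t))]

Because monocular translation is recovered only up to scale, the accumulation step takes an explicit scale factor; the ground-plane strategy mentioned in the summary would supply that factor by comparing the reconstructed height of ground-plane features against the known camera height.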
This paper is an extended version of our paper published in the 2023 IEEE International Workshop on Metrology for Living Environment (MetroLivEnv), Milano, Italy, 29–31 May 2023.
ISSN: 1424-8220
DOI: 10.3390/s24041274