User-Aided Global Registration Method using Geospatial 3D Data for Large-Scale Mobile Outdoor Augmented Reality

Bibliographic Details
Published in: 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), pp. 104-109
Main Authors: Burkard, Simon; Fuchs-Kittowski, Frank
Format: Conference Proceeding
Language: English
Published: IEEE, 01.11.2020

Summary: Accurate global camera registration is a key requirement for precise AR visualizations in large-scale outdoor AR applications. Existing approaches mostly rely on complex image-based registration methods that require large pre-registered databases of geo-referenced images or point clouds, which are hardly applicable to large-scale areas. In this paper, we present a simple yet effective user-aided registration method that utilizes common geospatial 3D data to globally register mobile devices. For this purpose, text-based 3D geospatial data, including digital 3D terrain and city models, is processed into small-scale 3D meshes and displayed in a live AR view. Via two common mobile touch gestures, the generated virtual models can be manually aligned to match the actual perception of the real-world environment. Experimental results show that, combined with a robust local visual-inertial tracking system, this approach enables efficient and accurate global registration of mobile devices in various environments, determining the camera attitude with less than one degree of deviation while achieving a high degree of immersion through realistic occlusion behavior.
DOI: 10.1109/ISMAR-Adjunct51615.2020.00041
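
Note: The abstract states that two common mobile touch gestures are used to manually align the rendered geospatial meshes with the real scene, but the record does not specify the gesture mapping. The short Kotlin sketch below is only an illustration, under assumed sensitivities and with hypothetical names (RegistrationCorrection, onHorizontalDrag, onVerticalDrag), of how such user-driven heading and height corrections to a coarse global pose could be accumulated and applied; it is not the authors' implementation.

// Minimal sketch; gesture deltas are assumed to arrive as screen-space pixel values,
// and the sensitivity constants are placeholders, not values from the paper.
class RegistrationCorrection(
    private val degreesPerPixel: Double = 0.05,   // horizontal-drag sensitivity (assumed)
    private val metersPerPixel: Double = 0.01     // vertical-drag sensitivity (assumed)
) {
    var yawOffsetDeg = 0.0      // accumulated heading correction around the up axis
        private set
    var heightOffsetM = 0.0     // accumulated vertical offset of the virtual meshes
        private set

    // One-finger horizontal drag: rotate the virtual model around the camera position.
    fun onHorizontalDrag(deltaXPx: Float) {
        yawOffsetDeg = (yawOffsetDeg + deltaXPx * degreesPerPixel).mod(360.0)
    }

    // Vertical drag: shift the virtual model up or down relative to the camera.
    fun onVerticalDrag(deltaYPx: Float) {
        heightOffsetM += deltaYPx * metersPerPixel
    }

    // Apply the correction to a coarse heading/altitude estimate (e.g. from GNSS and compass).
    fun correctedHeadingDeg(rawHeadingDeg: Double) = (rawHeadingDeg + yawOffsetDeg).mod(360.0)
    fun correctedAltitudeM(rawAltitudeM: Double) = rawAltitudeM + heightOffsetM
}

In such a setup, the rendering loop would feed correctedHeadingDeg and correctedAltitudeM into the transform that places the terrain and city-model meshes in the AR view, while the local visual-inertial tracker keeps the alignment stable between user corrections.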