Vision-aided UAV navigation using GIS data

Bibliographic Details
Published in: Proceedings of 2010 IEEE International Conference on Vehicular Electronics and Safety, pp. 78-82
Main Authors: Gu, Duo-Yu; Zhu, Cheng-Fei; Guo, Jiang; Li, Shu-Xiao; Chang, Hong-Xing
Format: Conference Proceeding
Language: English
Published: IEEE, 01.07.2010

Summary: This paper proposes a novel vision-aided navigation architecture to aid the inertial navigation system (INS) for accurate unmanned aerial vehicle (UAV) localization. Unlike previous image-localization methods such as scene matching and terrain contour matching, our approach registers meaningful object-level features extracted from real-time aerial imagery with geographic information system (GIS) data. First, we extract widely distributed object features from the aerial images, including roads, rivers, road intersections, villages, and bridges. Then, the extracted image features are delineated as geometric points and vectors, which coincide with the representation used by GIS data. Finally, a GIS model is constructed from the corresponding geographical object information in the GIS data, and the visual geometric features are registered with this model to obtain the absolute position of the image. Because the proposed method adopts GIS as its reference data, its storage requirement is lower than that of scene matching. In addition, every step of the approach can be computed efficiently, whereas the computational cost of terrain contour matching is very high. Simulation results demonstrate the feasibility of the proposed method for UAV localization.
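
The registration step described in the abstract can be made concrete with a small sketch. The Python snippet below is a minimal illustration, not the authors' implementation: it assumes the image-to-GIS feature correspondences are already established, models registration as a least-squares 2-D similarity transform (Umeyama's closed-form method) between image-plane landmark points and their GIS map counterparts, and uses the recovered transform to map the image centre to an absolute map position. All point data, coordinate frames, and the choice of transform model are illustrative assumptions.

import numpy as np

def estimate_similarity(img_pts, gis_pts):
    # Least-squares 2-D similarity transform (scale s, rotation R,
    # translation t) mapping image points onto GIS map coordinates,
    # computed with Umeyama's closed-form method.
    mu_i = img_pts.mean(axis=0)
    mu_g = gis_pts.mean(axis=0)
    ci = img_pts - mu_i
    cg = gis_pts - mu_g
    cov = cg.T @ ci / len(img_pts)          # 2x2 cross-covariance
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(2)
    if np.linalg.det(U @ Vt) < 0:           # guard against a reflection
        S[1, 1] = -1.0
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / ci.var(axis=0).sum()
    t = mu_g - s * R @ mu_i
    return s, R, t

# Toy data: three landmarks (e.g. road intersections and a bridge) matched
# between the aerial image (pixel coordinates) and the GIS layer (map metres).
# These values are fabricated for illustration only.
img_pts = np.array([[120.0, 80.0], [400.0, 90.0], [260.0, 300.0]])
gis_pts = np.array([[5012.0, 3308.0], [5292.0, 3318.0], [5152.0, 3528.0]])

s, R, t = estimate_similarity(img_pts, gis_pts)
centre = np.array([320.0, 240.0])           # image principal point, pixels
print("absolute position of image centre:", s * R @ centre + t)

With the exact correspondences above, the recovered transform has unit scale and an identity rotation, so the printed position is simply the image centre shifted into the map frame; real imagery would of course require the feature extraction and matching stages described in the paper.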
ISBN: 1424471249, 9781424471249
DOI: 10.1109/ICVES.2010.5550944