Mobile Camera Localization Using Aerial-view Images


Bibliographic Details
Published in: Information and Media Technologies, Vol. 9, No. 4, pp. 896-904
Main Authors: Toriya, Hisatoshi; Kitahara, Itaru; Ohta, Yuichi
Format: Journal Article
Language: English
Published: Tokyo: Information and Media Technologies Editorial Board, 01.01.2014 (Japan Science and Technology Agency)

Summary: This paper proposes a method to estimate a mobile camera's position and orientation by referring to corresponding points between aerial-view images from a GIS database and mobile camera images. The mobile camera images are taken from the user's viewpoint, and the aerial-view images include the same region. To increase the correspondence accuracy, we generate a virtual top-view image that virtually captures the target region from a viewpoint overhead of the user, by using the intrinsic parameters of the mobile camera and its inertia (gravity) information. We find corresponding points between the virtual top-view and aerial-view images and estimate a homography matrix that transforms the virtual top-view image into the aerial-view image. Finally, the mobile camera's position and orientation are estimated by analyzing this matrix. In some cases, however, it is difficult to obtain a sufficient number of correct corresponding points to estimate the correct homography matrix from only a single virtual top-view image. We solve this problem by stitching virtual top-view images to represent a larger ground region. We experimentally implemented our method on a tablet PC and evaluated its effectiveness.
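The core pipeline the summary describes, estimating a homography from point correspondences and then decomposing it into camera position and orientation, can be sketched generically with the standard DLT algorithm and the textbook planar-scene decomposition H = K [r1 r2 t]. This is an illustrative NumPy sketch, not the authors' implementation: the intrinsics K, the point sets, and the synthetic pose are all assumed values, and a real system would add RANSAC to reject outlier matches.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Direct Linear Transform: estimate 3x3 H with dst ~ H @ src
    from >= 4 point correspondences (Nx2 arrays)."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        A.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    # The solution of A h = 0 is the right singular vector with the
    # smallest singular value; reshape it into the 3x3 homography.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix the arbitrary scale

def project(H, pts):
    """Map Nx2 points through H with homogeneous normalization."""
    hom = np.hstack([pts, np.ones((len(pts), 1))])
    q = hom @ H.T
    return q[:, :2] / q[:, 2:3]

def pose_from_homography(H, K):
    """Recover rotation R and translation t of a camera viewing the
    z = 0 ground plane, given H ~ K [r1 r2 t] up to scale."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])
    if lam * M[2, 2] < 0:       # enforce positive depth (t_z > 0)
        lam = -lam
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)  # snap to the nearest true rotation
    return U @ Vt, t

# Synthetic check with an assumed camera over the ground plane.
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
c, s = np.cos(0.2), np.sin(0.2)
R_true = np.array([[1., 0., 0.], [0., c, -s], [0., s, c]])
t_true = np.array([0.5, -0.3, 4.0])
H_true = K @ np.column_stack([R_true[:, 0], R_true[:, 1], t_true])
H_true /= H_true[2, 2]

ground = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0.3, 0.7]], float)
image = project(H_true, ground)
H_est = estimate_homography(ground, image)
R_est, t_est = pose_from_homography(H_est, K)
print(np.allclose(H_est, H_true), np.allclose(R_est, R_true),
      np.allclose(t_est, t_true))  # → True True True
```

With exact synthetic correspondences the pose is recovered to machine precision; with real feature matches the DLT would be preceded by point normalization and wrapped in RANSAC.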
ISSN: 1881-0896
DOI: 10.11185/imt.9.896