Camera pose estimation framework for array‐structured images

Bibliographic Details
Published in: ETRI Journal, Vol. 44, No. 1, pp. 10-23
Main Authors: Shin, Min-Jung; Park, Woojune; Kim, Jung Hee; Kim, Joonsoo; Yun, Kuk-Jin; Kang, Suk-Ju
Format: Journal Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI), 01.02.2022
ISSN: 1225-6463; 2233-7326
DOI: 10.4218/etrij.2021-0303

More Information
Summary: Despite the significant progress in camera pose estimation and structure-from-motion reconstruction from unstructured images, methods that exploit a priori information on camera arrangements have been overlooked. Conventional state-of-the-art methods do not exploit this geometric structure to recover accurate camera poses from a set of patch images arranged in an array for mosaic-based imaging, which creates a wide field-of-view image by stitching together a collection of regular images. We propose a camera pose estimation framework that exploits the array-structured image setting at each incremental reconstruction step. It consists of two-way registration, 3D point outlier elimination, and bundle adjustment with a constraint term that enforces consistent rotation vectors to reduce reprojection errors during optimization. We demonstrate that by using the connected structure of individual images at different camera pose estimation steps, we can estimate camera poses more accurately on all structured mosaic-based image sets, including omnidirectional scenes.
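The final stage of the pipeline the summary describes, bundle adjustment with a constraint term for consistent rotation vectors, can be pictured as a least-squares problem whose reprojection residuals are augmented with a penalty on the difference between rotation vectors of adjacent cameras in the array. The sketch below illustrates that idea only; the minimal pinhole model, the penalty weight `lam`, the `neighbors` list, and the synthetic two-camera setup are all assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points, rvec, tvec, f):
    """Project 3D points with a minimal pinhole camera (focal length f)."""
    cam = Rotation.from_rotvec(rvec).apply(points) + tvec
    return f * cam[:, :2] / cam[:, 2:3]

def residuals(params, n_cams, n_pts, obs, cam_idx, pt_idx, f, lam, neighbors):
    """Reprojection residuals plus a rotation-consistency penalty."""
    rvecs = params[:n_cams * 3].reshape(n_cams, 3)
    tvecs = params[n_cams * 3:n_cams * 6].reshape(n_cams, 3)
    pts = params[n_cams * 6:].reshape(n_pts, 3)
    # Standard reprojection residuals, one 2-vector per observation.
    rep = [project(pts[j:j + 1], rvecs[i], tvecs[i], f)[0] - obs[k]
           for k, (i, j) in enumerate(zip(cam_idx, pt_idx))]
    # Penalty pulling rotation vectors of adjacent array cameras together.
    pen = [np.sqrt(lam) * (rvecs[a] - rvecs[b]) for a, b in neighbors]
    return np.concatenate(rep + pen)

# Synthetic two-camera array observing 8 shared 3D points (assumed data).
rng = np.random.default_rng(0)
pts_gt = rng.uniform([-1, -1, 4], [1, 1, 6], size=(8, 3))
rvecs_gt = np.array([[0.0, 0.0, 0.0], [0.0, 0.05, 0.0]])
tvecs_gt = np.array([[0.0, 0.0, 0.0], [-0.5, 0.0, 0.0]])
f, lam, neighbors = 800.0, 1.0, [(0, 1)]
cam_idx = np.repeat([0, 1], 8)
pt_idx = np.tile(np.arange(8), 2)
obs = np.vstack([project(pts_gt[j:j + 1], rvecs_gt[i], tvecs_gt[i], f)[0]
                 for i, j in zip(cam_idx, pt_idx)])

# Start from a perturbed guess and jointly refine poses and points.
x0 = np.concatenate([rvecs_gt.ravel() + 0.01 * rng.standard_normal(6),
                     tvecs_gt.ravel() + 0.05 * rng.standard_normal(6),
                     pts_gt.ravel() + 0.1 * rng.standard_normal(24)])
sol = least_squares(residuals, x0,
                    args=(2, 8, obs, cam_idx, pt_idx, f, lam, neighbors))
```

With `lam = 0` this reduces to ordinary bundle adjustment; a larger weight trades reprojection accuracy for rotations that stay consistent across the array, which is the trade-off the constraint term controls.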
Bibliography: Funding information
Min-Jung Shin, Woojune Park, and Jung Hee Kim should be considered joint first authors.
Institute of Information & Communications Technology Planning & Evaluation, Grant/Award Number: 2018‐0‐00207; Information Technology Research Center (ITRC), Grant/Award Number: IITP‐2021‐2018‐0‐01421; National Research Foundation of Korea (NRF), Grant/Award Number: 2021R1A2C1004208
https://doi.org/10.4218/etrij.2021-0303