A Novel One-To-One Framework for Relative Camera Pose Estimation
Published in: IEEE Access, p. 1
Main Authors:
Format: Journal Article
Language: English
Published: IEEE, 07.10.2024
Summary: To address the challenge of relative camera pose estimation, many permutation-invariant neural networks have been developed to process sparse correspondences with constant latency. These networks typically use an n-to-n framework, where n putative correspondences from the same image pair are placed in distinct batch instances without any specific order. This uncorrelated set-type input structure does not adequately support extracting contextual information for the correspondences. In this paper, we introduce a novel one-to-one framework designed to maximize context interaction within the network. Our framework prioritizes providing specialized context for each correspondence and enhancing the interaction between context data and correspondence data through a carefully designed input structure and network architecture schema. We conducted a series of experiments using various architectures within the one-to-one framework. Our results demonstrate that one-to-one networks not only match but often surpass the performance of traditional n-to-n networks, highlighting the one-to-one framework's significant potential and efficacy. To ensure a fair comparison, all one-to-one and n-to-n networks were trained on Google's Tensor Processing Units (TPUs). Notably, the memory capacity of a single TPUv4 device is sufficient to train the presented one-to-one networks without forming TPU pods from multiple devices.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3476238
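The contrast the summary draws between the two input structures can be made concrete with a small sketch. Everything below is an illustrative assumption rather than the authors' implementation: the shapes, variable names, and the mean-based context stand-in are hypothetical, and only serve to show how an n-to-n set-type batch differs from a one-to-one layout in which each correspondence is paired with its own context vector.

```python
# Illustrative sketch only (not the paper's code): input layouts for an
# n-to-n set-type framework versus a one-to-one framework.
import numpy as np

n = 512      # putative correspondences per image pair (hypothetical value)
d_ctx = 32   # per-correspondence context size (hypothetical value)

# Each correspondence is (x1, y1, x2, y2): matched point coordinates in the two images.
correspondences = np.random.rand(n, 4).astype(np.float32)

# n-to-n framework: the whole unordered set of an image pair forms one batch
# instance, and a permutation-invariant network maps n inputs to n outputs jointly.
n_to_n_input = correspondences[None, :, :]  # shape (1, n, 4)

# one-to-one framework: each correspondence becomes its own instance, concatenated
# with a specialized context vector. Here the context is a placeholder built from
# the mean of the other correspondences; the paper's framework supplies its own.
context = np.repeat(correspondences.mean(axis=0, keepdims=True), n, axis=0)  # (n, 4)
context = np.tile(context, (1, d_ctx // 4))                                  # (n, d_ctx)
one_to_one_input = np.concatenate([correspondences, context], axis=1)[:, None, :]  # (n, 1, 4 + d_ctx)

print(n_to_n_input.shape)      # (1, 512, 4)
print(one_to_one_input.shape)  # (512, 1, 36)
```

In the actual networks, the per-correspondence context would be produced by the framework's own input-construction scheme rather than this placeholder mean; the sketch only illustrates the shape of the data each framework consumes.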