High-Accuracy and Robust Localization of Large Control Markers for Geometric Camera Calibration

Bibliographic Details
Published in: IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 31, No. 2, pp. 376-383
Main Authors: Douxchamps, D.; Chihara, K.
Format: Journal Article
Language: English
Published: Los Alamitos, CA: IEEE (IEEE Computer Society; The Institute of Electrical and Electronics Engineers, Inc.), 01.02.2009

Summary: Accurate measurement of the position of features in an image is subject to a fundamental compromise: The features must be both small, to limit the effect of nonlinear distortions, and large, to limit the effect of noise and discretization. This constrains both the accuracy and the robustness of image measurements, which play an important role in geometric camera calibration as well as in all subsequent measurements based on that calibration. In this paper, we present a new geometric camera calibration technique that exploits the complete camera model during the localization of control markers, thereby abolishing the marker size compromise. Large markers allow a dense pattern to be used instead of a simple disc, resulting in a significant increase in accuracy and robustness. When highly planar markers are used, geometric camera calibration based on synthetic images leads to true errors of 0.002 pixels, even in the presence of artifacts such as noise, illumination gradients, compression, blurring, and limited dynamic range. The camera parameters are also accurately recovered, even for complex camera models.
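The core idea summarized above, localizing control markers with the help of the complete camera model rather than in isolation, can be illustrated by alternating between calibration and model-seeded re-localization. The sketch below is not the authors' implementation (which uses large dense-pattern markers rather than a chessboard); it is a minimal Python/OpenCV approximation, and the function name iterative_calibration, the board size, and all other parameters are illustrative assumptions.

    import cv2
    import numpy as np

    def iterative_calibration(gray_images, board_size=(9, 6), square=1.0, n_iter=3):
        """Alternate between fitting the camera model and re-localizing the
        control points with that model (a rough analogue of model-aware
        marker localization; not the paper's dense-marker method)."""
        # Planar control points in the calibration target's coordinate frame.
        objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square
        term = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 50, 1e-4)

        # Initial, model-free localization of the control points.
        used, img_pts = [], []
        for gray in gray_images:
            ok, corners = cv2.findChessboardCorners(gray, board_size, None)
            if ok:
                used.append(gray)
                img_pts.append(cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), term))
        obj_pts = [objp] * len(used)

        h, w = used[0].shape[:2]
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)

        for _ in range(n_iter):
            new_pts = []
            for gray, rvec, tvec in zip(used, rvecs, tvecs):
                # Seed each corner search from its reprojection under the current
                # camera model, so the local refinement already accounts for
                # the estimated lens distortion.
                proj, _ = cv2.projectPoints(objp, rvec, tvec, K, dist)
                seeds = np.ascontiguousarray(proj, dtype=np.float32)
                new_pts.append(cv2.cornerSubPix(gray, seeds, (11, 11), (-1, -1), term))
            img_pts = new_pts
            rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, (w, h), None, None)

        return rms, K, dist

The paper's method instead exploits the complete camera model when localizing its large, dense-pattern markers, which is what removes the marker-size compromise; the loop above only conveys the alternation between model estimation and model-aware localization.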
ISSN: 0162-8828, 1939-3539
DOI: 10.1109/TPAMI.2008.214