Real-Time Continuous Image Registration Enabling Ultraprecise 2-D Motion Tracking

Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 22, No. 5, pp. 2081-2090
Main Authors: Cheng, Peng; Menq, Chia-Hsiang
Format: Journal Article
Language: English
Published: New York, NY: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2013

Summary: In this paper, we present a novel continuous image registration method (CIRM) that yields near-zero bias and has high computational efficiency. It can be realized for real-time position estimation to enable ultraprecise 2-D motion tracking and motion control over a large motion range. Because the two variables of the method are continuous in the spatial domain, pixel-level image registration is unnecessary, and the CIRM can therefore track the moving target continuously as each target image arrives. When the method is applied to a specific target object, its measurement resolution is predicted from the reference image model of the object together with the variance of the camera's overall image noise. The maximum permissible target speed is proportional to the achievable frame rate, which is limited by the required computation time per frame. The precision, measurement resolution, and computational efficiency of the method are verified through computer simulations and experiments. Specifically, the CIRM is implemented and integrated with a visual sensing system; near-zero bias, a measurement resolution of 0.1 nm (0.0008 pixels), and measurement of one-nanometer stepping are demonstrated.
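The summary describes the method only at a high level; the Python sketch below illustrates one common way to realize continuous (subpixel) registration of this general kind: Gauss-Newton refinement of a 2-D translation against a spline-interpolated reference image, together with a Cramér-Rao-style prediction of achievable resolution from the reference gradients and the camera noise variance. This is not the authors' CIRM; the function names, defaults, and the specific interpolation and bound used here are illustrative assumptions.

```python
# Minimal sketch, NOT the CIRM itself: continuous 2-D translation
# estimation by Gauss-Newton refinement, with no pixel-level search.
import numpy as np
from scipy.ndimage import map_coordinates

def register_translation(reference, frame, shift0=(0.0, 0.0), iters=20, tol=1e-6):
    """Estimate a continuous shift (dx, dy) mapping `reference` onto `frame`."""
    rows, cols = np.mgrid[0:frame.shape[0], 0:frame.shape[1]]
    gref_y = np.gradient(reference, axis=0)  # reference gradients, reused
    gref_x = np.gradient(reference, axis=1)  # each iteration for the Jacobian
    dx, dy = shift0
    for _ in range(iters):
        coords = np.vstack([(rows + dy).ravel(), (cols + dx).ravel()])
        # Sample the reference model and its gradients at subpixel
        # positions via cubic-spline interpolation (order=3).
        warped = map_coordinates(reference, coords, order=3, mode='nearest')
        gx = map_coordinates(gref_x, coords, order=3, mode='nearest')
        gy = map_coordinates(gref_y, coords, order=3, mode='nearest')
        r = warped - frame.ravel()            # intensity residuals
        J = np.column_stack([gx, gy])         # Jacobian w.r.t. (dx, dy)
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        dx, dy = dx + step[0], dy + step[1]
        if np.hypot(step[0], step[1]) < tol:  # converged to a subpixel shift
            break
    return dx, dy

def predicted_resolution(reference, noise_var):
    """Cramer-Rao-style lower bound on the standard deviation of the
    estimated x-shift, from image gradients and camera noise variance."""
    gx = np.gradient(reference, axis=1)
    return np.sqrt(noise_var / np.sum(gx ** 2))
```

Warm-starting each frame from the previous estimate, a loop like this tracks the target continuously without any pixel-level pre-alignment, which is the property the summary emphasizes; the per-frame computation time then bounds the frame rate and hence the maximum trackable target speed.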
ISSN: 1057-7149
EISSN: 1941-0042
DOI: 10.1109/TIP.2013.2244608