Camera distortion calibration method based on nonspecific planar target
Published in | 2016 IEEE International Conference on Signal and Image Processing (ICSIP), pp. 452 - 457 |
---|---|
Main Authors | , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.08.2016 |
Subjects | |
Summary: | A camera distortion calibration method based on a nonspecific planar target and camera relative pose adjustment is proposed. The target can be any planar object, and the method only requires that the corresponding points of the images be coplanar. Compared with classic target-based camera calibration methods, the world coordinates of the points are not needed and the target pattern is nonspecific. The method rests on the observation that when the camera optical axis is perpendicular to the planar target during imaging, the x and y coordinates of corresponding points coincide after a similarity transformation, so the x and y coordinate residuals of corresponding points should be zero. Subject to these two constraints, the squared deviation of the coordinates of corresponding points is taken as the optimization objective, and a least-squares optimization technique is used to compute the camera distortion parameters. The steps are as follows: first, extract the set of corresponding feature points of the calibration target; then, adjust the camera so that its optical axis is perpendicular to the coordinate system of the planar target, so that the corresponding points coincide after rotation, translation, and scaling of the images; last, jointly compute the distortion coefficients. Experimental results indicate that the proposed algorithm is effective and reliable. |
---|---|
DOI: | 10.1109/SIPROCESS.2016.7888303 |
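
The abstract describes jointly fitting distortion coefficients and a 2D similarity transform by least squares over matched points from fronto-parallel views of a planar object. The sketch below is not the authors' code; it is a minimal illustration of that optimization idea under assumed details: a simple two-parameter radial distortion correction model, SciPy's `least_squares` as the solver, and hypothetical inputs `pts_a`, `pts_b` (matched N x 2 point sets) and `cx`, `cy` (image center).

```python
# Minimal sketch of the least-squares idea in the abstract (assumptions noted above):
# after removing lens distortion, corresponding points from two fronto-parallel views
# of a planar target should coincide up to a 2D similarity transform. We jointly
# estimate radial correction terms (k1, k2) and the similarity (s, theta, tx, ty)
# by minimizing the residual between the transformed points of one view and the other.

import numpy as np
from scipy.optimize import least_squares

def undistort(pts, k1, k2, cx, cy):
    """Apply a simple polynomial radial correction about the image center (assumed model)."""
    xy = pts - np.array([cx, cy])
    r2 = np.sum(xy**2, axis=1, keepdims=True)
    return np.array([cx, cy]) + xy * (1 + k1 * r2 + k2 * r2**2)

def residuals(params, pts_a, pts_b, cx, cy):
    """Residuals between view-A points mapped by a similarity transform and view-B points."""
    k1, k2, s, theta, tx, ty = params
    ua = undistort(pts_a, k1, k2, cx, cy)
    ub = undistort(pts_b, k1, k2, cx, cy)
    c, sn = np.cos(theta), np.sin(theta)
    R = np.array([[c, -sn], [sn, c]])
    mapped = s * ua @ R.T + np.array([tx, ty])
    return (mapped - ub).ravel()

def calibrate(pts_a, pts_b, cx, cy):
    """Return estimated (k1, k2), starting from 'no distortion, identity similarity'."""
    x0 = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
    sol = least_squares(residuals, x0, args=(pts_a, pts_b, cx, cy))
    return sol.x[:2]
```

In this sketch, `pts_a` and `pts_b` could come from any feature matcher (e.g. ORB or SIFT correspondences) between two images of the same planar object, consistent with the paper's claim that the target pattern is nonspecific; the distortion model and solver choice are illustrative assumptions, not the paper's exact formulation.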