Orthorectification Model for Extra-length Linear Array Imagery

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, p. 1
Main Authors: Zhou, Guoqing; Liu, Xingxing
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3223911

Summary: The orthorectification accuracy of an extra-length linear array image is restricted by large lens distortions and the strong correlation among exterior orientation parameters (EOPs). This paper presents an orthorectification model that addresses these problems by dividing the distortion into two zones, modeling the distortions with two different models, and imposing two constraints. The first constraint is axial-vector rotation coplanarity, i.e., the axial vector lies in the imaging plane after two rotations; the second is angle consistency, i.e., the angle between the ground-space viewing vectors equals the angle between the ideal image-space viewing vectors. Beijing-2 (BJ2) satellite images with a linear array of 30,076 pixels and SPOT-HRV images with 6,000 pixels are used to verify the method. The experimental results demonstrate that the orthorectification accuracy of the BJ2 image improves from 1.938 m (about 2.42 pixels) to 1.440 m (about 1.80 pixels), and that of the SPOT image improves from 14.668 m (about 1.47 pixels) to 8.657 m (about 0.87 pixels). Thus, the proposed method achieves higher accuracy than the traditional orthorectification method.
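The angle-consistency constraint stated in the summary can be expressed as a residual that vanishes when the angle between the ground-space viewing vectors matches the angle between the ideal image-space viewing vectors. The following is a minimal illustrative sketch of that residual (the function names are hypothetical and the paper's full model is not reproduced here):

```python
import math

def angle_between(u, v):
    """Angle in radians between two 3-D vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    # Clamp against floating-point drift outside [-1, 1] before acos.
    return math.acos(max(-1.0, min(1.0, dot / (norm_u * norm_v))))

def angle_consistency_residual(ground_v1, ground_v2, image_v1, image_v2):
    """Zero when the angle between the ground-space viewing vectors
    equals the angle between the ideal image-space viewing vectors."""
    return angle_between(ground_v1, ground_v2) - angle_between(image_v1, image_v2)
```

In an adjustment, such a residual would be driven toward zero alongside the coplanarity condition; the actual parameterization used by the authors is given in the full text.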