Unfocused plenoptic metric modeling and calibration

Bibliographic Details
Published in: Optics Express, Vol. 27, No. 15, pp. 20177-20198
Main Authors: Cai, Zewei; Liu, Xiaoli; Pedrini, Giancarlo; Osten, Wolfgang; Peng, Xiang
Format: Journal Article
Language: English
Published: United States, 22.07.2019

Summary: For unfocused plenoptic imaging systems, metric calibration is generally mandatory to achieve high-quality imaging and metrology. In this paper, we present an explicit derivation of an unfocused plenoptic metric model that associates a measured light field in object space with the recorded light field in image space, conforming physically to the imaging properties of unfocused plenoptic cameras. In addition, the impact of unfocused plenoptic imaging distortion on depth computation was experimentally explored, revealing that the radial distortion parameters contain depth-dependent common factors, which were then modeled as depth distortions. Consequently, a complete unfocused plenoptic metric model was established by combining the explicit metric model with the imaging distortion model. A three-step unfocused plenoptic metric calibration strategy, in which the Levenberg-Marquardt algorithm is used for parameter optimization, is correspondingly proposed to determine 12 internal parameters for each microlens unit. Based on the proposed modeling and calibration, the depth measurement precision is improved to 0.25 mm over a depth range of 300 mm, demonstrating the potential applicability of consumer unfocused plenoptic cameras in high-accuracy three-dimensional measurement.
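
The abstract describes the method only at a high level. As a rough illustration of the kind of optimization it mentions, the sketch below refines a simple per-microlens projection model, in which one radial distortion coefficient carries a depth-dependent factor, using a Levenberg-Marquardt solver (SciPy's least_squares with method="lm"). The 12-parameter layout, the distortion form, and the synthetic data are assumptions made for this example and are not the model actually derived in the paper.

```python
"""Illustrative sketch (not the paper's actual model): Levenberg-Marquardt
refinement of a per-microlens projection with a depth-dependent radial
distortion term."""
import numpy as np
from scipy.optimize import least_squares


def project(params, pts_obj):
    """Project 3-D object points to 2-D image points for one microlens unit.

    Assumed 12-parameter layout: fx, fy, cx, cy, k1, k2, d1, rx, ry, rz, tx, tz
      fx, fy, cx, cy : pinhole-style intrinsics
      k1, k2         : radial distortion coefficients
      d1             : depth-dependent distortion factor (illustrative)
      rx, ry, rz     : small-angle rotation; tx, tz : translation
                       (ty fixed at 0 to keep 12 parameters)
    """
    fx, fy, cx, cy, k1, k2, d1, rx, ry, rz, tx, tz = params
    R = np.array([[1.0, -rz,  ry],      # small-angle rotation approximation
                  [ rz, 1.0, -rx],
                  [-ry,  rx, 1.0]])
    cam = pts_obj @ R.T + np.array([tx, 0.0, tz])
    x, y, z = cam[:, 0] / cam[:, 2], cam[:, 1] / cam[:, 2], cam[:, 2]
    r2 = x ** 2 + y ** 2
    # Radial distortion whose first coefficient contains a depth-dependent
    # common factor, echoing the abstract's "depth distortion" idea.
    scale = 1.0 + (k1 + d1 / z) * r2 + k2 * r2 ** 2
    return np.column_stack([fx * x * scale + cx, fy * y * scale + cy])


def residuals(params, pts_obj, pts_img):
    # Stacked reprojection errors in pixels for all observed points.
    return (project(params, pts_obj) - pts_img).ravel()


# Synthetic calibration-target observations for demonstration only.
rng = np.random.default_rng(0)
pts_obj = rng.uniform([-50, -50, 250], [50, 50, 550], size=(300, 3))
true = np.array([800, 800, 320, 240, -0.10, 0.02, 5.0,
                 0.01, -0.02, 0.005, 2.0, 10.0])
pts_img = project(true, pts_obj) + rng.normal(0.0, 0.1, size=(300, 2))

# Levenberg-Marquardt refinement from a rough initial guess.
x0 = np.array([780, 780, 310, 230, 0, 0, 0, 0, 0, 0, 1.0, 8.0], dtype=float)
fit = least_squares(residuals, x0, args=(pts_obj, pts_img), method="lm")
print("RMS reprojection error (pixels):", np.sqrt(np.mean(fit.fun ** 2)))
```

In a real calibration, the synthetic points would be replaced by target features detected in the decoded light field, and the paper's three-step strategy would govern how the parameters are initialized and refined; the single joint refinement here is only a stand-in.
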
ISSN: 1094-4087
DOI: 10.1364/OE.27.020177