Adaptive Fine Distortion Correction Method for Stereo Images of Skin Acquired with a Mobile Phone

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 20, No. 16, p. 4492
Main Authors: Moon, Cho-I; Lee, Onseok
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 11.08.2020

Summary: With the development of mobile phones, we can acquire high-resolution images of the skin and observe its detailed features using a mobile camera. We acquire stereo images with a mobile camera to enable three-dimensional (3D) analysis of the skin surface. However, geometric changes in the observed skin structure caused by the lens distortion of the mobile phone reduce the accuracy of the 3D information extracted through stereo matching. Therefore, our study proposes a Distortion Correction Matrix (DCM) to correct the fine distortion of close-up mobile images pixel by pixel. We verified the correction performance by analyzing the results of correspondence-point matching in stereo images corrected using the DCM. We also confirmed the correction results for images taken at five different working distances and derived a linear regression model relating the angle of the image to the distortion ratio. The proposed DCM accounts for the degree of distortion, which differs between the left and right regions of the image. Finally, we performed a fine distortion correction that is difficult to verify with the naked eye. The results of this study enable accurate and precise 3D analysis of the skin surface using corrected mobile images.
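The summary describes two operations: a per-pixel correction applied via a matrix of displacements, and a linear regression relating image angle to distortion ratio. The sketch below illustrates both in NumPy; the function names, the nearest-neighbor warp, and the displacement-field representation are illustrative assumptions, not the paper's actual DCM formulation.

```python
import numpy as np

def apply_correction(image, dx, dy):
    """Warp a grayscale image by per-pixel displacements (dx, dy).

    Each output pixel (y, x) is sampled from (y + dy, x + dx), rounded
    to the nearest source pixel and clipped to the image bounds. A real
    implementation would typically use bilinear interpolation instead.
    """
    h, w = image.shape
    ys, xs = np.indices((h, w))
    src_x = np.clip(np.rint(xs + dx).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys + dy).astype(int), 0, h - 1)
    return image[src_y, src_x]

def fit_angle_distortion(angles, ratios):
    """Least-squares line: distortion ratio ~ slope * angle + intercept."""
    slope, intercept = np.polyfit(angles, ratios, 1)
    return slope, intercept
```

With zero displacement fields the warp is the identity, which gives a quick sanity check; fitting measured distortion ratios against view angles with `fit_angle_distortion` yields the kind of linear model the summary mentions.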
ISSN: 1424-8220
DOI: 10.3390/s20164492