Efficient Viewer-Centric Depth Adjustment Based on Virtual Fronto-Parallel Planar Projection in Stereo 3D Images


Bibliographic Details
Published in: IEEE Transactions on Multimedia, Vol. 16, no. 2, pp. 326–336
Main Authors: Park, Hanje; Lee, Hoonjae; Sull, Sanghoon
Format: Journal Article
Language: English
Published: New York, NY: IEEE, 01.02.2014

Summary: This paper presents an efficient method for adjusting the 3D depth of an object, up to an entire scene, in stereo 3D images by utilizing a virtual fronto-parallel planar projection in the 3D space perceived by a viewer. The proposed method requires only object correspondence rather than accurate estimation of the disparity field or point correspondence. We simulate the depth adjustment of a 3D point perceived by a viewer through a corresponding pair of points in the stereo 3D images by moving the virtual fronto-parallel plane onto which the left and right points are projected. We show that the resulting transformation of the points' image coordinates can be expressed simply by three values, a scale factor and two translations, all depending on a single depth-adjustment parameter. The experimental results demonstrate the feasibility of the proposed approach, which yields less visual fatigue and smaller 3D shape distortion than the conventional parallax adjustment method. The overall procedure can be efficiently applied to each frame of a stereo video without causing any artifact.
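The summary states that the depth adjustment reduces to a similarity transform of image coordinates: one scale factor and two translations, driven by a single depth-adjustment parameter. As a rough illustration of that idea (not the authors' derivation; the function names, parameter split, and values below are hypothetical), corresponding left and right points can be scaled uniformly and shifted, with the difference in horizontal shifts changing the screen disparity and hence the perceived depth:

```python
def adjust_point(x, y, s, tx, ty):
    """Apply a scale-plus-translation transform to one image point."""
    return s * x + tx, s * y + ty

def adjust_stereo_pair(left_pt, right_pt, s, t_left, t_right, ty):
    """Transform corresponding left/right points with a shared scale s.

    The horizontal translations t_left and t_right may differ between
    the two views; changing their difference changes the screen
    disparity, which is what moves the perceived 3D point in depth.
    """
    xl, yl = left_pt
    xr, yr = right_pt
    new_left = adjust_point(xl, yl, s, t_left, ty)
    new_right = adjust_point(xr, yr, s, t_right, ty)
    return new_left, new_right

# Example (hypothetical numbers): shrink slightly and pull the views
# apart horizontally, which increases the disparity between the pair.
left, right = adjust_stereo_pair((100.0, 50.0), (90.0, 50.0),
                                 s=0.95, t_left=4.0, t_right=-4.0, ty=0.0)
```

Because each frame needs only these three values per adjusted object, the transform is cheap enough to apply to every frame of a stereo video, consistent with the efficiency claim in the summary.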
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2013.2286567