Accurate depth estimation in structured light fields
Published in | Optics express Vol. 27; no. 9; pp. 13532 - 13546 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | United States, 29.04.2019 |
Summary | Passive light field imaging generally relies on depth cues that depend on the image structure to perform depth estimation, which causes robustness and accuracy problems in complex scenes. In this study, the commonly used depth cues, defocus and correspondence, were analyzed using phase encoding instead of the image structure. The defocus cue, obtained from spatial variance, is rendered insensitive by the global spatial monotonicity of the phase-encoded field. In contrast, the correspondence cue is sensitive to the angular variance of the phase-encoded field, and the correspondence responses across the depth range have single-peak distributions. Based on this analysis, a novel active light field depth estimation method is proposed that directly uses the correspondence cue in the structured light field to search for non-ambiguous depths, so no optimization is required. Furthermore, the angular variance can be weighted according to the phase encoding information to reduce the depth estimation uncertainty. Depth estimation on an experimental scene with rich colors demonstrated that the proposed method distinguishes different depth regions in each color segment more clearly and shows substantially improved phase consistency compared with the passive method, verifying its robustness and accuracy. |
ISSN | 1094-4087 |
DOI | 10.1364/OE.27.013532 |
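
As a rough illustration of the correspondence cue described in the summary, the Python sketch below refocuses a phase-encoded light field at a set of candidate disparities and, for each pixel, keeps the disparity that minimizes the angular variance of the phase values; because the correspondence response is single-peaked over depth, a direct argmin suffices and no further optimization is applied. The function name, array layout, nearest-neighbour shearing, and synthetic phase ramp are illustrative assumptions, not the authors' implementation, and the phase-based weighting of the angular variance mentioned in the summary is omitted for brevity.

```python
# Hypothetical sketch of a correspondence-cue depth search in a phase-encoded
# (structured) light field. Names and the synthetic data are illustrative.
import numpy as np

def correspondence_depth(phase_lf, disparities):
    """Pick, per pixel, the candidate disparity that minimizes angular variance.

    phase_lf    : (U, V, H, W) array of unwrapped phase values, one (H, W)
                  phase map per angular sample (u, v).
    disparities : 1-D array of candidate disparities (pixels per angular step).
    Returns an (H, W) map of disparity indices at the minimum-variance
    (single-peak) correspondence response.
    """
    U, V, H, W = phase_lf.shape
    u0, v0 = (U - 1) / 2.0, (V - 1) / 2.0          # central view coordinates
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    cost = np.empty((len(disparities), H, W))

    for k, d in enumerate(disparities):
        # Shear (refocus) every view toward the central view by d per angular
        # step, then measure how much the phase still varies across the views.
        stack = np.empty((U * V, H, W))
        for u in range(U):
            for v in range(V):
                shifted_x = np.clip(xs + d * (v - v0), 0, W - 1)
                shifted_y = np.clip(ys + d * (u - u0), 0, H - 1)
                stack[u * V + v] = phase_lf[
                    u, v, shifted_y.astype(int), shifted_x.astype(int)
                ]
        cost[k] = stack.var(axis=0)                # angular variance = correspondence cost

    return cost.argmin(axis=0)                     # non-ambiguous minimum, no optimization

# Tiny synthetic example: a fronto-parallel phase ramp with known disparity.
if __name__ == "__main__":
    U = V = 5
    H = W = 32
    true_d = 1.0
    v0 = (V - 1) / 2.0
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    lf = np.stack([
        np.stack([0.2 * (xs - true_d * (v - v0)) for v in range(V)])
        for u in range(U)
    ])                                             # monotonic phase encoding along x
    depth_idx = correspondence_depth(lf, np.linspace(0.0, 2.0, 9))
    print(depth_idx[H // 2, W // 2])               # expect 4, i.e. candidate disparity 1.0
```

In this toy setup the variance collapses to zero only at the true disparity, so the per-pixel argmin recovers it directly; a weighting of the angular variance, as the summary suggests, could be folded into the `cost` accumulation if reliability information from the phase encoding were available.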