Super-resolved time-of-flight sensing via FRI sampling theory

Bibliographic Details
Published in: Proceedings of the ... IEEE International Conference on Acoustics, Speech and Signal Processing (1998), pp. 4009 - 4013
Main Authors: Bhandari, Ayush; Wallace, Andrew M.; Raskar, Ramesh
Format: Conference Proceeding; Journal Article
Language: English
Published: IEEE, 01.03.2016
Summary: Optical time-of-flight (ToF) sensors measure scene depth accurately by projecting and receiving an optical signal. The range to a surface in the path of the emitted signal is proportional to the delay time of the light echo, i.e., the reflected signal. In practice, a diverging beam may be subject to multi-echo backscatter, and all of these echoes must be resolved to estimate the multiple depths. In this paper, we propose a method for super-resolution of optical ToF signals. Our contributions are twofold. Starting with a general image formation model common to most ToF sensors, we draw a striking analogy between ToF systems and sampling theory. Based on our model, we reformulate the ToF super-resolution problem as a parameter estimation problem pivoted around the finite-rate-of-innovation framework. In particular, we show that super-resolution of a multi-echo backscattered signal amounts to recovery of Dirac impulses from low-pass measurements. Our theory is corroborated by analysis of data collected from a photon-counting LiDAR sensor, showing the effectiveness of our non-iterative and computationally efficient algorithm.
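The core idea in the summary, recovering a stream of Dirac impulses (echo delays and amplitudes) from low-pass measurements, can be sketched with the standard annihilating-filter (Prony-type) method from the finite-rate-of-innovation literature. The sketch below is illustrative, not the paper's implementation: the window length, number of echoes, and measurement model (ideal Fourier coefficients of the Dirac stream) are all assumptions made for the example.

```python
import numpy as np

# Illustrative FRI recovery: K Dirac echoes from 2M+1 low-pass
# (Fourier) measurements via the annihilating-filter method.
# All parameter values here are hypothetical, not from the paper.

tau = 1.0                          # observation window (s), assumed
t_true = np.array([0.22, 0.31])    # true echo delays (s), assumed
a_true = np.array([1.0, 0.6])      # echo amplitudes, assumed
K = len(t_true)

# Low-pass measurements: Fourier coefficients X[m] = sum_k a_k u_k^m,
# with u_k = exp(-2*pi*j*t_k/tau).
M = 4
m = np.arange(-M, M + 1)
X = (a_true[None, :] * np.exp(-2j * np.pi * np.outer(m, t_true) / tau)).sum(axis=1)

# A filter h of length K+1 whose zeros are the u_k annihilates X:
# sum_l h[l] X[m-l] = 0. Stack these equations and take the SVD
# null vector as the filter.
rows = 2 * M + 1 - K
A = np.array([[X[i + K - j] for j in range(K + 1)] for i in range(rows)])
_, _, Vh = np.linalg.svd(A)
h = Vh[-1].conj()

# Polynomial roots of h recover u_k, hence the delays t_k.
z = np.roots(h)
t_est = np.sort(np.mod(-np.angle(z) * tau / (2 * np.pi), tau))

# Amplitudes follow from a least-squares Vandermonde fit.
V = np.exp(-2j * np.pi * np.outer(m, t_est) / tau)
a_est = np.real(np.linalg.lstsq(V, X, rcond=None)[0])

print(t_est, a_est)
```

Note the non-iterative structure the summary alludes to: one SVD for the filter, one polynomial rooting for the delays, and one linear solve for the amplitudes, so two echoes closer than the low-pass bandwidth can still be separated exactly in the noiseless case.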
ISSN: 2379-190X
DOI: 10.1109/ICASSP.2016.7472430