An 80×60 Flash LiDAR Sensor with In-Pixel Histogramming TDC Based on Quaternary Search and Time-Gated Δ-Intensity Phase Detection for 45m Detectable Range and Background Light Cancellation

Bibliographic Details
Published in: 2022 IEEE International Solid-State Circuits Conference (ISSCC), Vol. 65, pp. 98-100
Main Authors: Park, Seonghyeok; Kim, Bumjun; Cho, Junhee; Chun, Jung-Hoon; Choi, Jaehyuk; Kim, Seong-Jin
Format: Conference Proceeding
Language: English
Published: IEEE, 20.02.2022

Summary: Light detection and ranging (LiDAR) sensors have become one of the key building blocks for realizing metaverse applications with VR/AR in mobile devices and level-5 autonomous vehicles. In particular, SPAD-based direct time-of-flight (D-ToF) sensors have emerged as LiDAR sensors because they offer a longer maximum detectable range and higher background light immunity than indirect time-of-flight (I-ToF) sensors with photon-mixing devices [1]. However, their complicated front- and back-end blocks for resolving ToF values as short as 100ps require high-resolution TDCs and several memories, limiting the spatial resolution and the depth accuracy at short ranges. To address this issue, alternative architectures combining both D-ToF and I-ToF techniques have been reported [2, 3]. Direct-indirect-mixed frame synthesis provides accurate depth information by detecting phases at short ranges while creating a sparse depth map by counting photons at long ranges [2]. A two-step histogramming TDC is used in [3], where a coarse D-ToF step discriminates distance roughly and a fine I-ToF step extracts depth precisely. However, these approaches still suffer from limited depth accuracy [2] or low spatial resolution [3].
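The quaternary-search histogramming named in the title can be illustrated with a toy model: split the active timing window into four gates, keep the gate that collects the most photon returns, and recurse into it until the bin width reaches the desired resolution. This is only a behavioral sketch under assumed conditions (the function names, photon counts, and noise levels are invented for illustration; it does not reproduce the paper's in-pixel circuit or its Δ-intensity phase detection):

```python
import random

C = 299_792_458  # speed of light, m/s

def tof_to_distance(t_s):
    """Convert a round-trip time of flight (seconds) to one-way distance (m)."""
    return C * t_s / 2

def quaternary_search_tof(timestamps, t_max, passes):
    """Coarse-to-fine ToF estimate over photon arrival timestamps.

    Each pass splits the current window into 4 gates, counts photons per
    gate, and narrows the window to the busiest gate, so `passes` passes
    shrink the bin width to t_max / 4**passes.
    """
    lo, hi = 0.0, t_max
    for _ in range(passes):
        width = (hi - lo) / 4
        counts = [0, 0, 0, 0]
        for t in timestamps:
            if lo <= t < hi:
                # min() guards against float rounding at the upper edge
                counts[min(3, int((t - lo) / width))] += 1
        g = counts.index(max(counts))
        lo, hi = lo + g * width, lo + (g + 1) * width
    return (lo + hi) / 2  # centre of the final bin

# Toy scene: an echo from an 18 m target inside a ~45 m unambiguous range
# (a 300 ns round trip), plus uniform background-light photons.
random.seed(0)
t_max = 300e-9
true_tof = 120e-9  # 18 m target
signal = [random.gauss(true_tof, 0.1e-9) for _ in range(500)]
background = [random.uniform(0, t_max) for _ in range(100)]
est = quaternary_search_tof(signal + background, t_max, passes=8)
print(round(tof_to_distance(est), 2))  # ~18 m
```

Because the signal peak dominates every gate it falls into, the search locks onto it even with background counts present; 8 passes over a 300 ns window reach a ~4.6 ps bin, comparable in spirit to the ~100 ps ToF resolution discussed in the summary.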
ISSN: 2376-8606
DOI: 10.1109/ISSCC42614.2022.9731112