Transient Neural Radiance Fields for Lidar View Synthesis and 3D Reconstruction

| Main Authors | , , , , |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | 14.07.2023 |
| Subjects | |

Summary: Neural radiance fields (NeRFs) have become a ubiquitous tool for modeling scene appearance and geometry from multiview imagery. Recent work has also begun to explore how to use additional supervision from lidar or depth sensor measurements in the NeRF framework. However, previous lidar-supervised NeRFs focus on rendering conventional camera imagery and use lidar-derived point cloud data as auxiliary supervision; thus, they fail to incorporate the underlying image formation model of the lidar. Here, we propose a novel method for rendering transient NeRFs that take as input the raw, time-resolved photon count histograms measured by a single-photon lidar system, and we seek to render such histograms from novel views. Different from conventional NeRFs, the approach relies on a time-resolved version of the volume rendering equation to render the lidar measurements and capture transient light transport phenomena at picosecond timescales. We evaluate our method on a first-of-its-kind dataset of simulated and captured transient multiview scans from a prototype single-photon lidar. Overall, our work brings NeRFs to a new dimension of imaging at transient timescales, newly enabling rendering of transient imagery from novel views. Additionally, we show that our approach recovers improved geometry and conventional appearance compared to point cloud-based supervision when training on few input viewpoints. Transient NeRFs may be especially useful for applications which seek to simulate raw lidar measurements for downstream tasks in autonomous driving, robotics, and remote sensing.
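
The summary refers to a "time-resolved version of the volume rendering equation" but does not state it. The sketch below is one plausible discretization, assuming the standard NeRF quadrature weights and assigning each ray sample to a histogram bin by its round-trip time of flight; it is not the authors' formulation. The function name `render_transient`, the bin width, and the bin count are illustrative assumptions.

```python
import numpy as np

C_LIGHT = 3e8        # speed of light [m/s] (approximate)
BIN_WIDTH = 4e-12    # histogram bin width [s]; illustrative, not from the paper
NUM_BINS = 2048      # number of time-resolved bins; illustrative

def render_transient(sigma, radiance, deltas, depths):
    """Sketch of time-resolved volume rendering along one ray.

    sigma:    (N,) volume densities at the ray samples
    radiance: (N,) returned radiance (lidar return strength) at the samples
    deltas:   (N,) spacing between consecutive samples [m]
    depths:   (N,) distance of each sample from the sensor [m]

    Returns an (NUM_BINS,) transient: expected photon counts per time bin.
    """
    # Standard volume rendering quadrature: per-sample opacity and transmittance.
    alpha = 1.0 - np.exp(-sigma * deltas)
    trans = np.exp(-np.cumsum(np.concatenate([[0.0], sigma[:-1] * deltas[:-1]])))
    weights = trans * alpha

    # Round-trip time of flight (2 * depth / c) maps each sample to a time bin.
    bins = np.floor(2.0 * depths / (C_LIGHT * BIN_WIDTH)).astype(int)
    bins = np.clip(bins, 0, NUM_BINS - 1)

    # Accumulate each sample's contribution into its time bin.
    transient = np.zeros(NUM_BINS)
    np.add.at(transient, bins, weights * radiance)
    return transient
```

In this reading, summing the transient over all bins recovers the conventional (time-integrated) rendered value, while returns from farther samples land in later bins, which is what lets the representation be supervised directly against raw photon count histograms.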

DOI: 10.48550/arxiv.2307.09555