Multi‐Event‐Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion
| Published in | Advanced Intelligent Systems, Vol. 4, No. 12 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Weinheim: John Wiley & Sons, Inc. (Wiley), 01.12.2022 |
Summary: Event cameras are bio‐inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at microsecond resolution and producing a stream of brightness changes. This unconventional output has sparked novel computer vision methods that unlock the camera's potential. Here, the problem of event‐based stereo 3D reconstruction for SLAM is considered. Most event‐based stereo methods attempt to exploit the high temporal resolution of the camera and the simultaneity of events across cameras to establish matches and estimate depth. By contrast, this work investigates how to estimate depth without explicit data association, by fusing disparity space images (DSIs) originating from efficient monocular methods. Fusion theory is developed and applied to design multi‐camera 3D reconstruction algorithms that produce state‐of‐the‐art results, as confirmed by comparisons with four baseline methods and tests on a variety of available datasets.

Event cameras are novel bio‐inspired sensors offering advantages over traditional cameras. They mimic human eyes, naturally responding to moving patterns in the scene asynchronously. How can a robot with two or more event‐camera eyes map its 3D environment? A stereo correspondence‐free approach is presented that fuses volumetric event‐data representations early in the system: https://github.com/tub-rip/dvs_mcemvs
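The summary describes fusing per-camera disparity space images (DSIs) rather than matching individual events across cameras. A minimal sketch of that idea, assuming each camera has already produced a ray-density DSI over a common set of depth planes: the function names, the harmonic-mean fusion rule, and the toy data below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def fuse_dsis_harmonic(dsi_a, dsi_b, eps=1e-6):
    """Fuse two DSIs (H x W x D ray-count volumes) with a harmonic mean.

    A harmonic mean is small unless BOTH cameras assign a high score to a
    voxel, so it suppresses voxels supported by only one view (an implicit
    consistency check, without explicit event matching). The eps term only
    guards against division by zero. This fusion rule is an assumption
    for illustration, not necessarily the one used in the paper.
    """
    return 2.0 * dsi_a * dsi_b / (dsi_a + dsi_b + eps)

def depth_from_dsi(dsi, depth_planes):
    """Per pixel, pick the depth plane whose fused score is highest."""
    return depth_planes[np.argmax(dsi, axis=2)]

# Toy example: two 4x4 DSIs over 3 hypothetical depth planes.
rng = np.random.default_rng(0)
dsi_a = rng.random((4, 4, 3))
dsi_b = rng.random((4, 4, 3))
depth_planes = np.array([1.0, 2.0, 4.0])  # meters (illustrative values)

fused = fuse_dsis_harmonic(dsi_a, dsi_b)
depth_map = depth_from_dsi(fused, depth_planes)
print(depth_map.shape)  # (4, 4)
```

The design point the summary makes is that fusion happens early, on the volumetric representations themselves, so no per-event stereo correspondence is ever computed; only the fused volume is reduced to a depth map.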
ISSN: 2640-4567
DOI: 10.1002/aisy.202200221