R2L-SLAM: Sensor Fusion-Driven SLAM Using mmWave Radar, LiDAR and Deep Neural Networks

Bibliographic Details
Published in: 2023 IEEE SENSORS, pp. 1-4
Main Authors: Balemans, Niels; Hooft, Lucas; Reiter, Philippe; Anwar, Ali; Steckel, Jan; Mercelis, Siegfried
Format: Conference Proceeding
Language: English
Published: IEEE, 29.10.2023

Summary: Optical sensing modalities are extensively used in autonomous vehicles (AVs). These sensors are, however, not always reliable, particularly in harsh or difficult sensing conditions, such as smoke or rain. This limitation can impact their application potential due to safety concerns, since optical sensors can fail to reliably perceive obstacles in such harsh conditions. To address this, it would be desirable to include other modalities, such as radar, in the perception sensor suites of these AVs. However, this is difficult because many recent state-of-the-art navigation algorithms are designed specifically for LiDAR sensors. In this work, we propose a modality prediction method that allows for the addition of a single-chip mmWave radar sensor to an existing sensor setup consisting of a 2D LiDAR sensor, without changing the current downstream applications. We demonstrate the increased reliability of our method in situations where optical sensing modalities become inaccurate and unreliable.
ISSN: 2168-9229
DOI: 10.1109/SENSORS56945.2023.10324990
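The abstract's core idea is that a radar-derived "pseudo-LiDAR" scan can stand in for degraded optical beams so that downstream LiDAR-based applications need not change. The sketch below illustrates that drop-in substitution in plain Python; the per-beam validity heuristic and the radar-predicted ranges are hypothetical placeholders, not the authors' actual deep-learning method (the paper's radar-to-LiDAR prediction network is not described in this record):

```python
from typing import List

def fuse_scans(lidar: List[float],
               radar_pred: List[float],
               max_range: float = 10.0) -> List[float]:
    """Per-beam fallback: keep the LiDAR range when it looks valid,
    otherwise substitute the radar-predicted pseudo-LiDAR range.

    A beam is treated as invalid when the optical return is missing
    (NaN or 0.0) or saturated at max_range, as can happen in smoke
    or rain. This is an illustrative heuristic only.
    """
    fused = []
    for l, r in zip(lidar, radar_pred):
        invalid = (l != l) or l <= 0.0 or l >= max_range  # l != l catches NaN
        fused.append(r if invalid else l)
    return fused

# Example: beams 1 and 3 are lost to smoke (dropout / saturation);
# radar-predicted ranges fill those gaps, leaving the scan format unchanged.
lidar_scan = [2.1, 0.0, 3.4, 10.0]
pseudo_lidar = [2.0, 1.8, 3.5, 4.2]  # hypothetical network output
print(fuse_scans(lidar_scan, pseudo_lidar))  # [2.1, 1.8, 3.4, 4.2]
```

Because the fused output has the same shape and units as the original LiDAR scan, an existing SLAM or navigation pipeline can consume it without modification, which is the "without changing the current downstream applications" property the abstract emphasizes.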