Self-supervised 4-D Radar Odometry for Autonomous Vehicles

Bibliographic Details
Published in: 2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), pp. 764-769
Main Authors: Zhou, Huanyu; Lu, Shouyi; Zhuo, Guirong
Format: Conference Proceeding
Language: English
Published: IEEE, 24.09.2023
Summary: Reliable ego-motion estimation is a crucial technology for autonomous vehicles. While progress has been made in deep odometry systems using cameras and LiDAR, 4-D radar odometry holds significant promise because radar is robust to adverse weather and lighting conditions. Nevertheless, radar-based odometry faces several challenges: 1) radar point clouds are sparser and noisier than LiDAR point clouds; 2) radar points on moving objects interfere with deep odometry; 3) the dependence on massive labeled data limits the practical application of supervised learning-based radar odometry. To address these challenges, this work proposes a self-supervised 4-D radar odometry method. Specifically, we employ a multi-scale approach to extract robust features from sparse point clouds. Besides introducing several traditional LiDAR-based loss functions, we design a novel velocity-aware loss based on radar characteristics to achieve self-supervised radar odometry. Moreover, we develop a point confidence estimation module to reduce the interference of moving objects and noise. We conduct comprehensive experiments on a public dataset to demonstrate the advanced performance of our method.
ISSN: 2153-0017
DOI: 10.1109/ITSC57777.2023.10422466
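The abstract's "velocity-aware loss" exploits a property of 4-D radar that LiDAR lacks: each point carries a measured Doppler (radial) velocity. The paper does not give its exact formulation here, but the underlying idea can be sketched as follows: for a static point, the measured Doppler velocity should equal the projection of the (negated) ego-velocity onto the point's viewing direction, so the residual between measured and predicted Doppler provides a self-supervision signal, and a per-point confidence can down-weight moving objects and noise. The function and weighting below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def velocity_aware_loss(points, doppler, v_ego, confidence=None):
    """Illustrative velocity-aware loss (a sketch, not the paper's exact loss).

    For a static point p observed from a sensor moving with velocity v_ego,
    the expected Doppler (radial) velocity is -(p / ||p||) . v_ego.
    Points whose measured Doppler deviates from this prediction are likely
    moving objects or noise; an optional confidence vector down-weights them.
    """
    # Unit viewing direction of each point (sensor at the origin).
    directions = points / np.linalg.norm(points, axis=1, keepdims=True)
    # Radial velocity each point would show if it were static.
    predicted = -directions @ v_ego
    # Per-point absolute Doppler residual.
    residual = np.abs(doppler - predicted)
    if confidence is None:
        confidence = np.ones(len(points))
    # Confidence-weighted mean residual.
    return float(np.sum(confidence * residual) / np.sum(confidence))

# Toy example: three static points seen while moving at 10 m/s along x.
pts = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [7.0, 7.0, 0.0]])
v_ego = np.array([10.0, 0.0, 0.0])
# Ideal Doppler measurements for perfectly static points.
dop = -(pts / np.linalg.norm(pts, axis=1, keepdims=True)) @ v_ego
print(velocity_aware_loss(pts, dop, v_ego))  # → 0.0 for static points
```

In a learning setting, `v_ego` would come from the network's predicted transform and the loss would be minimized end-to-end, which is what makes the signal self-supervised: no ground-truth poses are needed.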