Self-supervised 4-D Radar Odometry for Autonomous Vehicles
| Published in | 2023 IEEE 26th International Conference on Intelligent Transportation Systems (ITSC), pp. 764-769 |
| --- | --- |
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 24.09.2023 |
| Summary | Reliable ego-motion estimation is a crucial technology for autonomous vehicles. While progress has been made in deep odometry systems using cameras and LiDAR, 4-D radar odometry holds significant potential because radar is robust to adverse weather and lighting conditions. Nevertheless, radar-based odometry faces several challenges: 1) radar point clouds are sparser and noisier than LiDAR point clouds; 2) radar points belonging to moving objects interfere with deep odometry; 3) the dependence on massive labeled data limits the practical application of supervised learning-based radar odometry. To address these challenges, this work proposes a self-supervised 4-D radar odometry method. Specifically, we employ a multi-scale approach to extract robust features from sparse point clouds. Besides introducing several traditional LiDAR-based loss functions, we design a novel velocity-aware loss based on radar characteristics to achieve self-supervised radar odometry. Moreover, we develop a point confidence estimation module to reduce the interference of moving objects and noise. We conduct comprehensive experiments on a public dataset to demonstrate the advanced performance of our method. |
| --- | --- |
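The record does not give the paper's actual formulation of the velocity-aware loss or the confidence module. A minimal sketch of the general idea behind such losses, under the assumption that 4-D radar reports a per-point radial (Doppler) velocity: for a static point, the measured radial velocity should equal the projection of the negated ego-velocity onto that point's bearing, and per-point confidences can down-weight moving objects and noise. The function name, signature, and weighting below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def velocity_aware_loss(points, doppler, ego_velocity, confidence=None):
    """Hypothetical confidence-weighted velocity-consistency loss.

    points       : (N, 3) radar points in the sensor frame
    doppler      : (N,) measured radial velocities (negative = approaching)
    ego_velocity : (3,) estimated ego-velocity in the sensor frame
    confidence   : optional (N,) per-point weights in [0, 1]
    """
    # Unit bearing vector from the sensor to each point.
    dirs = points / np.linalg.norm(points, axis=1, keepdims=True)
    # For a static point, the expected Doppler reading is the projection
    # of the negated ego-velocity onto the point's bearing.
    predicted = -(dirs @ ego_velocity)
    residual = np.abs(doppler - predicted)
    if confidence is None:
        confidence = np.ones(len(points))
    # Confidence-weighted mean residual: points on moving objects violate
    # the static-world assumption and can be down-weighted.
    return float(np.sum(confidence * residual) / np.sum(confidence))
```

With the ego vehicle moving forward at 10 m/s, a static point dead ahead should read a Doppler of -10 m/s and a static point directly to the side should read 0, so the loss vanishes; a point pacing the ego vehicle (Doppler 0 while directly ahead) contributes a residual of 10 unless its confidence suppresses it.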
| ISSN | 2153-0017 |
| --- | --- |
| DOI | 10.1109/ITSC57777.2023.10422466 |