Local Descriptor for Robust Place Recognition Using LiDAR Intensity



Bibliographic Details
Published in: IEEE Robotics and Automation Letters, Vol. 4, No. 2, pp. 1470-1477
Main Authors: Guo, Jiadong; Borges, Paulo V. K.; Park, Chanoh; Gawel, Abel
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2019

More Information
Summary: Place recognition is a challenging problem in mobile robotics, especially in unstructured environments or under viewpoint and illumination changes. Most LiDAR-based methods rely on geometric features to overcome such challenges, as scene geometry is generally invariant to these changes, whereas they significantly affect camera-based solutions. Compared to cameras, however, LiDARs lack the strong and descriptive appearance information that imaging can provide. To combine the benefits of geometry and appearance, we propose coupling the conventional geometric information from the LiDAR with its calibrated intensity return. This strategy yields highly descriptive information in the form of a new descriptor design, coined ISHOT, which outperforms popular state-of-the-art geometry-only descriptors by a significant margin in our local descriptor evaluation. To complete the framework, we furthermore develop a probabilistic keypoint voting place recognition algorithm that leverages the new descriptor and yields sublinear place recognition performance. The efficacy of our approach is validated in challenging global localization experiments in large-scale built-up and unstructured environments.
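
The abstract describes ISHOT as coupling a LiDAR's geometric information with its calibrated intensity return. As a rough illustration only, and not the authors' ISHOT formulation, the Python sketch below shows one simplified way a geometry-plus-intensity local descriptor could be assembled: it concatenates a SHOT-like normal-angle histogram with an intensity histogram over a keypoint's neighborhood. The function name, bin counts, and radius are all hypothetical choices made for this sketch.

import numpy as np

def toy_intensity_descriptor(points, normals, intensities, keypoint, keypoint_normal,
                             radius=0.5, geom_bins=11, int_bins=10):
    """Toy local descriptor: a cosine(normal-angle) histogram (geometric cue)
    concatenated with an intensity histogram (appearance cue).
    Illustrative sketch only; not the ISHOT descriptor from the paper."""
    # Select neighbors inside the support radius around the keypoint.
    dists = np.linalg.norm(points - keypoint, axis=1)
    mask = dists < radius
    if not np.any(mask):
        return np.zeros(geom_bins + int_bins)

    # Geometric part: histogram of cos(angle) between the keypoint normal
    # and the neighbors' normals.
    cosines = np.clip(normals[mask] @ keypoint_normal, -1.0, 1.0)
    geom_hist, _ = np.histogram(cosines, bins=geom_bins, range=(-1.0, 1.0))

    # Appearance part: histogram of (assumed calibrated) intensity returns in [0, 1].
    int_hist, _ = np.histogram(np.clip(intensities[mask], 0.0, 1.0),
                               bins=int_bins, range=(0.0, 1.0))

    # Concatenate and L2-normalize, as is common for local descriptors.
    desc = np.concatenate([geom_hist, int_hist]).astype(float)
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0.0 else desc

In the framework summarized above, such per-keypoint descriptors would then be matched against a database and aggregated by the paper's probabilistic keypoint voting scheme to retrieve the most likely place.
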
ISSN: 2377-3766
DOI: 10.1109/LRA.2019.2893887