Tracking Method of GM-APD LiDAR Based on Adaptive Fusion of Intensity Image and Point Cloud
Published in | Applied Sciences, Vol. 14, No. 17, p. 7884 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | Basel: MDPI AG, 01.09.2024 |
Subjects | |
Summary: | In dynamic tracking scenes, the target is often obstructed by obstacles, leading to a loss of target information and a decrease in tracking accuracy or even complete tracking failure. To address these challenges, we leverage the capability of Geiger-mode avalanche photodiode (GM-APD) LiDAR to acquire both intensity images and point cloud data, and develop a target tracking method based on the fusion of intensity images and point cloud data. Building upon kernelized correlation filtering (KCF), we introduce Fourier descriptors computed from the intensity images to enhance the representational capacity of the target features, thereby achieving precise target tracking in the intensity images. Additionally, an adaptive factor is designed based on the peak sidelobe ratio and the intrinsic shape signature to accurately detect occlusions. Finally, after occlusion detection, the tracking results of the Kalman filter and the KCF are fused using the adaptive factor to obtain the location of the target's central point. The proposed method is validated through simulations on the KITTI tracking dataset, yielding an average position error of 0.1182 m for the central point of the target. Moreover, our approach achieves an average tracking accuracy 21.67% higher than that of the Kalman filtering algorithm and 7.94% higher than that of the extended Kalman filtering algorithm. |
---|---|
ISSN: | 2076-3417 |
DOI: | 10.3390/app14177884 |
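
The abstract describes fusing the KCF estimate of the target centre (from intensity images) with a Kalman-filter prediction, weighted by an adaptive factor that falls toward the motion prediction when occlusion is detected. Below is a minimal sketch of that fusion step, assuming a simplified adaptive factor driven only by the KCF peak-to-sidelobe ratio; the paper additionally uses an intrinsic shape signature on the point cloud, and all function names and thresholds here are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of the occlusion-aware fusion outlined in the abstract:
# a KCF tracker estimates the target centre from intensity images, a Kalman
# filter predicts it from past motion, and an adaptive factor weights the two.
# The paper's actual adaptive factor also involves intrinsic shape signatures
# computed on the point cloud; that term is omitted here for brevity.

def adaptive_factor(psr, psr_min=5.0, psr_max=12.0):
    """Map the KCF peak-to-sidelobe ratio to a confidence weight in [0, 1].

    A low PSR suggests occlusion, so the weight shifts toward the Kalman
    prediction; the thresholds are illustrative, not from the paper.
    """
    return float(np.clip((psr - psr_min) / (psr_max - psr_min), 0.0, 1.0))


def fuse_centre(kcf_centre, kalman_centre, psr):
    """Fuse the two centre estimates using the adaptive factor."""
    w = adaptive_factor(psr)
    kcf_centre = np.asarray(kcf_centre, dtype=float)
    kalman_centre = np.asarray(kalman_centre, dtype=float)
    return w * kcf_centre + (1.0 - w) * kalman_centre


if __name__ == "__main__":
    # Unoccluded frame: high PSR, the fusion follows the KCF measurement.
    print(fuse_centre([10.2, 3.1, 25.0], [10.0, 3.0, 24.8], psr=11.5))
    # Occluded frame: low PSR, the fusion falls back on the Kalman prediction.
    print(fuse_centre([14.0, 5.0, 30.0], [10.4, 3.2, 25.3], psr=4.0))
```

In this sketch a low peak-to-sidelobe ratio is treated as evidence of occlusion, so the fused centre relies more on the Kalman prediction; restoring the point-cloud shape term would simply add a second component to the weight.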