Extraction of Street Pole-Like Objects Based on Plane Filtering From Mobile LiDAR Data
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 59, No. 1, pp. 749-768
Format: Journal Article
Language: English
Published: New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2021
Summary: Pole-like objects are important street infrastructure for road inventory and road mapping. In this article, we propose a novel pole-like object extraction algorithm based on plane filtering from mobile Light Detection and Ranging (LiDAR) data. The proposed approach is composed of two parts. In the first part, a novel octree-based split scheme fits initial planes to off-ground points; the resulting planes drive the subsequent extraction of pole-like objects. In the second part, we propose a novel pole-like object extraction method that filters these planes using local geometric feature restrictions and isolation detection. The proposed approach is a new solution for detecting pole-like objects from mobile LiDAR data. Its key innovation is the assumption that each pole-like object can be represented by a plane, so that extracting pole-like objects reduces to a plane-selection problem. The proposed method has been tested on three data sets captured from different scenes. The average completeness, correctness, and quality of our approach reach 87.66%, 88.81%, and 79.03%, respectively, which is superior to state-of-the-art approaches. The experimental results indicate that our approach extracts pole-like objects robustly and efficiently.
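The record does not detail the paper's octree-based split scheme or its plane-filtering criteria, but the core idea, fitting a plane to a candidate point cluster and keeping near-vertical planes as pole candidates, can be sketched minimally. The function names, PCA-based fit, and threshold below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_plane_pca(points):
    """Fit a plane to an (N, 3) point array via PCA.

    Returns (centroid, unit normal, planarity), where the normal is the
    eigenvector of the covariance matrix with the smallest eigenvalue.
    """
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                   # direction of least variance
    # Planarity: how plate-like the cluster is relative to its total spread.
    planarity = (eigvals[1] - eigvals[0]) / eigvals[2]
    return centroid, normal, planarity

def looks_pole_like(points, max_normal_z=0.3):
    """Illustrative geometric restriction: a vertical pole-like object,
    represented by a plane, yields a near-horizontal plane normal
    (small z-component). The 0.3 threshold is an assumption for this
    sketch, not a value from the paper."""
    _, normal, _ = fit_plane_pca(points)
    return abs(normal[2]) < max_normal_z
```

A full pipeline would apply such a test to every fitted plane from the off-ground points and additionally check isolation from neighboring structures, as the abstract describes.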
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2020.2993454