Multi-information PointNet++ fusion method for DEM construction from airborne LiDAR data

Bibliographic Details
Published in: Geocarto International, Vol. 38, No. 1
Main Authors: Hu, Hong; Zhang, Guanghe; Ao, Jianfeng; Wang, Chunlin; Kang, Ruihong; Wu, Yanlan
Format: Journal Article
Language: English
Published: Taylor & Francis Group, 31 December 2023
Summary: Airborne light detection and ranging (LiDAR) is a popular remote sensing technology that can significantly improve the efficiency of digital elevation model (DEM) construction. However, identifying real terrain features in complex areas from LiDAR data remains challenging. To address this problem, this work proposes a multi-information fusion method based on PointNet++ to improve the accuracy of DEM construction. The RGB data and normalized coordinate information of the point cloud were added as extra channels on the input side of the PointNet++ neural network, which improves classification accuracy during feature extraction. Low- and high-density point clouds obtained from the International Society for Photogrammetry and Remote Sensing (ISPRS) and the United States Geological Survey (USGS) were used to test the proposed method. The results suggest that the proposed method improves the Kappa coefficient by 8.81% compared with PointNet++. Compared with the conventional algorithm, the Type I error was reduced by 2.13%, the Type II error by 8.29%, and the total error by 2.52%. These results indicate that the proposed method can obtain DEMs with higher accuracy.
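The summary describes two concrete steps: stacking RGB values and normalized coordinates as additional input channels ahead of PointNet++, and scoring the resulting ground/non-ground classification with the Kappa coefficient and Type I/II/total errors. The sketch below illustrates both under stated assumptions. The record does not give the authors' exact normalization or channel layout, so the min-max normalization, the nine-channel arrangement, and the Sithole-Vosselman-style error definitions (Type I = true ground points rejected, Type II = non-ground points accepted as ground) are illustrative assumptions, not the paper's verified implementation.

import numpy as np

def build_fusion_features(xyz, rgb):
    """Stack raw coordinates, RGB, and min-max normalized coordinates
    into an (N, 9) per-point array -- an assumed channel layout for the
    multi-information input side of a PointNet++-style network."""
    xyz = np.asarray(xyz, dtype=np.float32)           # (N, 3) point coordinates
    rgb = np.asarray(rgb, dtype=np.float32) / 255.0   # (N, 3) colors scaled to [0, 1]
    mins = xyz.min(axis=0)
    spans = np.maximum(xyz.max(axis=0) - mins, 1e-8)  # guard against flat extents
    xyz_norm = (xyz - mins) / spans                   # per-tile min-max normalization
    return np.concatenate([xyz, rgb, xyz_norm], axis=1)

def filtering_errors(y_true, y_pred):
    """Type I, Type II, total error, and Cohen's kappa for binary
    ground (True) / non-ground (False) labels, following the common
    ISPRS ground-filtering convention (an assumption here)."""
    y_true = np.asarray(y_true, dtype=bool)
    y_pred = np.asarray(y_pred, dtype=bool)
    a = np.sum(y_true & y_pred)         # ground correctly kept
    b = np.sum(y_true & ~y_pred)        # ground rejected      -> Type I
    c = np.sum(~y_true & y_pred)        # non-ground accepted  -> Type II
    d = np.sum(~y_true & ~y_pred)       # non-ground correctly rejected
    n = a + b + c + d
    type1 = b / max(a + b, 1)           # share of true ground that was lost
    type2 = c / max(c + d, 1)           # share of non-ground leaking into the DEM
    total = (b + c) / n
    po = (a + d) / n                                     # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return type1, type2, total, kappa

In this sketch the nine-channel array would replace the bare XYZ input of the stock network, e.g. feats = build_fusion_features(points[:, :3], points[:, 3:6]) for a point array that carries colors in columns 3-5; the function names and layout here are hypothetical.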
ISSN: 1010-6049, 1752-0762
DOI: 10.1080/10106049.2022.2153929