SHREC 2021: 3D point cloud change detection for street scenes

Bibliographic Details
Published in: Computers & Graphics, Vol. 99, p. 192
Main Authors: Ku, Tao; Galanakis, Sam; Boom, Bas; Veltkamp, Remco C.; Bangera, Darshan; Gangisetty, Shankar; Stagakis, Nikolaos; Arvanitis, Gerasimos; Moustakas, Konstantinos
Format: Journal Article
Language: English
Published: Oxford: Elsevier Science Ltd, 01.10.2021

Summary: The rapid development of 3D acquisition devices enables us to collect billions of points in a few hours. However, analysis of the output data is a challenging task, especially in the field of 3D point cloud change detection. In this Shape Retrieval Challenge (SHREC) track, we provide a street-scene dataset for 3D point cloud change detection. The dataset consists of 866 pairs of 3D objects, captured in 2016 and 2020, drawn from 78 large-scale street-scene 3D point clouds. Our goal is to detect changes between multi-temporal point clouds in a complex street environment. We compare three methods on this benchmark: one handcrafted (PoChaDeHH) and two learning-based (HGI-CD and SiamGCN). The results show that the handcrafted algorithm performs evenly across all classes, while the learning-based methods achieve far stronger overall performance but suffer from class imbalance and may fail on minority classes. The randomized oversampling strategy applied in SiamGCN alleviates this problem. Also, the differing Siamese network architectures of HGI-CD and SiamGCN offer useful guidance for designing networks for the 3D change detection task.
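
The shared idea behind both learning-based entries is a Siamese network: a weight-shared encoder embeds the 2016 and 2020 point clouds of the same object, and a classifier predicts the change class from the joined embeddings. The PyTorch sketch below illustrates only that general idea; it is not the authors' SiamGCN or HGI-CD code (those use graph convolutions and hierarchical graph inference, respectively), the PointNet-style MLP encoder is a stand-in, and the class count is a placeholder.

```python
# Minimal Siamese change-classification sketch (illustrative, not SiamGCN/HGI-CD).
import torch
import torch.nn as nn

class SiameseChangeNet(nn.Module):
    def __init__(self, num_classes: int = 4):  # placeholder class count
        super().__init__()
        # Shared per-point encoder; SiamGCN would use graph convolutions here.
        self.encoder = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 256),
        )
        # Classifier over the concatenated global features of both epochs.
        self.classifier = nn.Sequential(
            nn.Linear(512, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def embed(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, num_points, 3) -> global feature via max-pooling.
        return self.encoder(points).max(dim=1).values

    def forward(self, cloud_2016: torch.Tensor, cloud_2020: torch.Tensor) -> torch.Tensor:
        # The same weight-shared encoder processes both point clouds.
        f1, f2 = self.embed(cloud_2016), self.embed(cloud_2020)
        return self.classifier(torch.cat([f1, f2], dim=-1))

model = SiameseChangeNet(num_classes=4)
logits = model(torch.randn(2, 1024, 3), torch.randn(2, 1024, 3))
print(logits.shape)  # torch.Size([2, 4])
```

Weight sharing maps both epochs into the same feature space, so the classifier only has to reason about differences between the two embeddings rather than about each scan in isolation.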
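The summary credits SiamGCN's randomized oversampling with alleviating the class imbalance problem. Below is a minimal, library-free sketch of that general strategy, assuming a plain index/label list rather than the benchmark's actual data pipeline: minority-class samples are re-drawn with replacement until every class contributes equally to a training epoch.

```python
# Randomized oversampling sketch for class-imbalanced training (illustrative).
import random
from collections import defaultdict

def oversample(indices, labels, seed: int = 0):
    """Return a training index list in which each class appears equally often."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, lab in zip(indices, labels):
        by_class[lab].append(idx)
    target = max(len(v) for v in by_class.values())  # size of the largest class
    resampled = []
    for lab, idxs in by_class.items():
        resampled.extend(idxs)  # keep every original sample once
        # Top up minority classes by drawing with replacement.
        resampled.extend(rng.choices(idxs, k=target - len(idxs)))
    rng.shuffle(resampled)
    return resampled

# Toy usage: "added" is a minority class next to "no_change".
labels = ["no_change"] * 6 + ["added"] * 2
balanced = oversample(range(len(labels)), labels)
print(len(balanced))  # 12: both classes now contribute 6 samples
```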
ISSN: 0097-8493
EISSN: 1873-7684
DOI: 10.1016/j.cag.2021.07.004