Unsupervised reconstruction of Building Information Modeling wall objects from point cloud data

Bibliographic Details
Published in: Automation in Construction, Vol. 120, p. 103338
Main Authors: Bassier, Maarten; Vergauwen, Maarten
Format: Journal Article
Language: English
Published: Amsterdam: Elsevier B.V., 01.12.2020
Summary: Scan-to-BIM of existing buildings is in high demand by the construction industry. However, these models are costly and time-consuming to create. The automation of this process is still the subject of ongoing research. Current obstacles include the interpretation and reconstruction of raw point cloud data, which is complicated by the complexity of built structures, the vast amount of data to be processed and the variety of objects in the built environment. This research aims to overcome the current obstacles and reconstruct the structure of buildings in an unsupervised manner. More specifically, a novel method is presented to automatically reconstruct BIM wall objects and their topology. Key contributions of the method are the ability to reconstruct different wall axis and connection types and the simultaneous processing of entire multi-story structures. The method is validated with the Stanford 2D–3D-Semantics Dataset (2D–3D-S).

Highlights:
• Unordered point clouds of entire buildings are processed fully automatically in 3D.
• The wall observations are retrieved through machine learning techniques.
• The method outperforms manual wall modeling (LOA validation).
• The method reconstructs lines, arcs and polylines and their best-fit connections.
• The method is made Open Source with additional samples.
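The abstract mentions fitting wall axes (lines, arcs, polylines) to point cloud observations. As a minimal illustrative sketch only, not the authors' actual pipeline, the straight-line case can be approximated by a principal-component line fit on the wall points projected onto the floor plan; the function name `fit_wall_axis` and the synthetic data below are assumptions for illustration:

```python
import numpy as np

def fit_wall_axis(points):
    """Fit a best-fit 2D line (wall axis) to planar wall points via PCA.

    points: (N, 2) array of x, y coordinates projected onto the floor plan.
    Returns (centroid, direction), where direction is a unit vector along
    the dominant extent of the wall footprint.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The first right-singular vector of the centered points is the
    # direction of maximum variance, i.e. the wall axis.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    return centroid, direction

# Synthetic example: noisy points along a wall running in the x direction.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 0.05 * rng.standard_normal(200)  # thin wall footprint
centroid, direction = fit_wall_axis(np.column_stack([x, y]))
```

A real scan-to-BIM pipeline would additionally classify curved segments (arcs), chain collinear segments into polylines, and resolve wall connections, as the paper's highlights describe.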
ISSN: 0926-5805; 1872-7891
DOI: 10.1016/j.autcon.2020.103338