Deep Learning Based Semantic Segmentation for BIM Model Generation from RGB-D Sensors


Bibliographic Details
Published in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. X-4/W5-2024, pp. 271-279
Main Authors: Rached, Ishraq; Hajji, Rafika; Landes, Tania; Haffadi, Rashid
Format: Journal Article
Language: English
Published: Göttingen: Copernicus GmbH / Copernicus Publications, 27.06.2024

Summary: RGB-D sensors offer a low-cost and promising solution to streamline the generation of BIM models. This paper introduces a framework designed to automate the creation of detailed and semantically rich BIM models from RGB-D data in indoor environments. The framework leverages advanced computer vision and deep learning techniques to overcome the challenges associated with traditional, labour-intensive BIM modeling methods. The results show that the proposed method is robust and accurate compared to high-quality static terrestrial laser scanning (TLS). Indeed, 58% of the distances measured between the calculated point cloud and the reference point cloud produced by TLS were under 5 cm, and 82% of the distances were smaller than 7 cm. Furthermore, the framework achieves 100% accuracy in element extraction. Beyond its accuracy, the proposed framework significantly enhances efficiency in both data acquisition and processing. In contrast to the time-consuming process associated with TLS, our approach reduces the data collection and processing time by a factor of eight. This highlights the framework's substantial improvements in accuracy and efficiency throughout the BIM generation workflow, making it a streamlined and time-effective solution.
ISSN: 2194-9050
2194-9042
DOI:10.5194/isprs-annals-X-4-W5-2024-271-2024