A Structural Damage Assessment Model Based on Deep Learning

Bibliographic Details
Published in 2023 31st Signal Processing and Communications Applications Conference (SIU), pp. 1-4
Main Authors Taskin, G., Kaya, H., Turan, O. T., Cinar, T., Ilki, A.
Format Conference Proceeding
Language English
Published IEEE 05.07.2023
Summary: Rapid and accurate damage assessment of buildings in residential areas after an earthquake is an important issue for the reconstruction of damaged cities. This study aims to automatically classify the damage conditions of structural load-bearing elements such as columns and beams with deep learning models, using photographs taken of buildings after the earthquake. The ConvNeXt model, a new-generation deep convolutional neural network, was used as the deep learning method. The trained model makes it possible to distinguish between two damage classes, structural and non-structural. In addition, the position of cracks on a structural element, which is important for engineers in damage detection, can also be determined. As the training dataset, reinforced concrete building images taken after the Elazığ earthquake and labeled by experts were used. With the ConvNeXt model, a reliable model with high accuracy was obtained using different transfer learning strategies together with fine-tuning, data augmentation, and regularization techniques. The results were compared with other convolutional neural network methods accepted in the literature, and the ConvNeXt-based method was observed to produce faster and more accurate results.
DOI: 10.1109/SIU59756.2023.10223945
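
The abstract describes a ConvNeXt transfer-learning pipeline with fine-tuning, data augmentation, and regularization. The sketch below is a minimal, hypothetical illustration of such a setup in PyTorch/torchvision, not the authors' implementation: the dataset path, class count, augmentation choices, and hyperparameters are assumptions, since the abstract does not specify them.

```python
# Hypothetical ConvNeXt transfer-learning sketch for damage classification.
# Paths, class count, and hyperparameters are illustrative assumptions only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.models import convnext_tiny, ConvNeXt_Tiny_Weights

NUM_CLASSES = 2  # e.g., structural vs. non-structural damage (assumed)

# Data augmentation and ImageNet normalization (typical choices, not from the paper).
train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.7, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical ImageFolder layout: damage_dataset/train/<class_name>/*.jpg
train_ds = datasets.ImageFolder("damage_dataset/train", transform=train_tf)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=4)

# Load ConvNeXt-Tiny pretrained on ImageNet and replace the classification head.
model = convnext_tiny(weights=ConvNeXt_Tiny_Weights.IMAGENET1K_V1)
in_features = model.classifier[2].in_features
model.classifier[2] = nn.Linear(in_features, NUM_CLASSES)

# One possible transfer-learning strategy: freeze the backbone, train only the head.
for p in model.features.parameters():
    p.requires_grad = False

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

# Regularization via weight decay and label smoothing (illustrative values).
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)
optimizer = torch.optim.AdamW(
    filter(lambda p: p.requires_grad, model.parameters()),
    lr=1e-3, weight_decay=0.05,
)

def train_one_epoch():
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

# Stage 1: head-only training; stage 2: unfreeze the backbone and fine-tune at a lower LR.
for _ in range(5):
    train_one_epoch()
for p in model.features.parameters():
    p.requires_grad = True
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5, weight_decay=0.05)
for _ in range(10):
    train_one_epoch()
```

The two-stage schedule (head-only training followed by full fine-tuning at a lower learning rate) is one common transfer-learning strategy; the paper reports comparing several strategies, but their exact configurations are not given in the abstract.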