Concrete crack detection with handwriting script interferences using faster region‐based convolutional neural network

Bibliographic Details
Published in: Computer-Aided Civil and Infrastructure Engineering, Vol. 35, No. 4, pp. 373-388
Main Authors: Deng, Jianghua; Lu, Ye; Lee, Vincent Cheng-Siong
Format: Journal Article
Language: English
Published: Hoboken: Wiley Subscription Services, Inc., 01.04.2020

Summary: The current bridge maintenance practice generally involves manual visual inspection, which is highly subjective and unreliable. A technique that can automatically detect defects, for example, surface cracks, is essential so that early warnings can be triggered to prevent disaster due to structural failure. In this study, to permit automatic identification of concrete cracks, an ad-hoc faster region-based convolutional neural network (faster R-CNN) was applied to contaminated real-world images taken from concrete bridges with complex backgrounds, including handwriting. A dataset of 5,009 cropped images was generated and labeled for two different objects: cracks and handwriting. The proposed network was then trained and tested on this dataset. Four full-scale images containing complex disturbance information were used to assess the performance of the trained network. The results demonstrate that faster R-CNN can automatically locate cracks in raw images, even in the presence of handwriting scripts. For comparison, the proposed network was also evaluated against the You Only Look Once v2 (YOLOv2) detection technique.
ISSN: 1093-9687, 1467-8667
DOI: 10.1111/mice.12497