A deep learning‐based method for pixel‐level crack detection on concrete bridges
Published in: IET Image Processing, Vol. 16, No. 10, pp. 2609-2622
Format: Journal Article
Language: English
Published: Wiley, 01.08.2022
Summary: Crack detection on concrete bridges is an essential index in the safety assessment of bridge structures. In damage assessment, detecting the complete crack structure matters more than local accuracy. However, traditional deep learning methods cannot detect the complete crack structure, which makes image-based crack detection challenging. For this reason, we propose deep bridge crack classification (DBCC)-Net, a classification-based deep learning network. By pruning YOLOX, the regression problem of object detection is converted into a binary classification problem, avoiding the performance degradation caused by the translation invariance of the convolutional neural network (CNN). In addition, network post-processing and a two-stage crack detection strategy are proposed so that the network can quickly detect cracks and extract crack morphology from high-resolution images. In the first stage, DBCC-Net coarsely extracts crack positions by classifying image slices. In the second stage, a semantic segmentation network extracts the complete crack morphology at the locations suggested by the first stage. Experimental results show that the proposed two-stage method achieves 19 frames per second (FPS) and a mean intersection over union (mIoU) of 0.79 on actual bridge images of 2560×2560 pixels. Although the FPS is lower, the mIoU is 7.8% higher than that of other methods, demonstrating the method's practical value.
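The summary describes a two-stage strategy: a lightweight classifier first flags image slices that contain crack, and a segmentation network then extracts pixel-level crack morphology only at those locations. The sketch below illustrates that control flow; the module definitions, the 256-pixel patch size, and the 0.5 threshold are illustrative assumptions, not the published DBCC-Net architecture.

```python
# Minimal sketch of the two-stage strategy described in the summary.
# PatchClassifier, CrackSegmenter, PATCH, and thresh are assumptions
# for illustration, not the authors' actual DBCC-Net code.
import torch
import torch.nn as nn

PATCH = 256  # assumed slice size; 2560x2560 splits into a 10x10 grid


class PatchClassifier(nn.Module):
    """Stage 1: binary crack / no-crack classifier for image slices."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))  # crack logit


class CrackSegmenter(nn.Module):
    """Stage 2: per-pixel crack mask for patches flagged by stage 1."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),
        )

    def forward(self, x):
        return self.net(x)  # per-pixel logits


@torch.no_grad()
def detect(image, classifier, segmenter, thresh=0.5):
    """Slice a high-resolution image, keep only the patches the
    classifier flags as containing crack, and segment those patches."""
    _, H, W = image.shape
    mask = torch.zeros(H, W)
    for y in range(0, H, PATCH):
        for x in range(0, W, PATCH):
            patch = image[:, y:y + PATCH, x:x + PATCH].unsqueeze(0)
            if torch.sigmoid(classifier(patch)) > thresh:   # stage 1
                logits = segmenter(patch)                   # stage 2
                mask[y:y + PATCH, x:x + PATCH] = (
                    torch.sigmoid(logits)[0, 0] > thresh
                ).float()
    return mask


# Usage on a dummy full-resolution image:
img = torch.rand(3, 2560, 2560)
crack_mask = detect(img, PatchClassifier().eval(), CrackSegmenter().eval())
```

The design choice the summary hints at: most of a 2560×2560 bridge image contains no crack, so running the cheap stage-1 classifier everywhere and the expensive segmenter only on flagged patches keeps pixel-level morphology extraction fast enough for high-resolution inspection.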
ISSN: 1751-9659, 1751-9667
DOI: 10.1049/ipr2.12512
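The reported 0.79 mIoU averages per-class intersection over union; for a crack detector there are typically two classes, crack and background. A minimal sketch of the metric, assuming binary label maps (the function and the toy arrays are illustrative, not the paper's evaluation code):

```python
import numpy as np

def mean_iou(pred, gt, num_classes=2):
    """Mean intersection over union across classes, as used for the
    0.79 figure in the summary (binary crack/background assumed)."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:               # skip classes absent from both masks
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 4x4 label maps (1 = crack pixel, 0 = background)
pred = np.array([[0, 0, 1, 1], [0, 1, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]])
gt   = np.array([[0, 0, 1, 1], [0, 1, 1, 0], [0, 0, 1, 0], [1, 0, 0, 0]])
print(mean_iou(pred, gt))  # averages crack IoU (5/7) and background IoU (9/11)
```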