BARNet: Boundary Aware Refinement Network for Crack Detection

Bibliographic Details
Published in: IEEE Transactions on Intelligent Transportation Systems, Vol. 23, No. 7, pp. 7343-7358
Main Authors: Guo, Jing-Ming; Markoni, Herleeyandi; Lee, Jiann-Der
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2022

Summary: Road cracks are one of the prominent problems that frequently occur on highways and main roads. Manual road crack evaluation is laborious, time-consuming, and inaccurate, and it raises several implementation issues. Conversely, computer vision-based solutions are challenging due to complex ambient conditions, including illumination, shadow, dust, and crack shape. Most cracks appear as irregular edge patterns, which are the most important features for detection. Recent advances in deep learning adopt a convolutional neural network as the base model to detect and localize cracks from a single RGB image. Yet, this approach yields inaccurate boundaries for crack localization, resulting in thicker and blurrier edges. To overcome this problem, the study proposes a novel and robust deep learning-based road crack detection method that also considers the original edge of the image as an additional feature. The main contribution of this work is adapting the original image gradient to the coarse crack detection result and refining it to produce more precise crack boundaries. Extensive experimental results show that the proposed method outperforms former state-of-the-art methods in terms of detection accuracy.
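
The abstract describes fusing the original image gradient with a coarse CNN crack map before a refinement step. The following is a minimal PyTorch sketch of that general idea only; the module structure, layer sizes, and names (e.g., GradientRefinementHead) are hypothetical illustrations and not the authors' actual BARNet architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def sobel_gradient(image: torch.Tensor) -> torch.Tensor:
    """Per-pixel Sobel gradient magnitude of a grayscale image (B, 1, H, W)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]],
                      device=image.device).view(1, 1, 3, 3)
    ky = kx.transpose(2, 3)
    gx = F.conv2d(image, kx, padding=1)
    gy = F.conv2d(image, ky, padding=1)
    return torch.sqrt(gx ** 2 + gy ** 2 + 1e-6)


class GradientRefinementHead(nn.Module):
    """Hypothetical refinement head: concatenates a coarse crack map with the
    image gradient and predicts a sharper, boundary-aware crack map."""

    def __init__(self, hidden: int = 16):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(2, hidden, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 1, 1),
        )

    def forward(self, coarse_map: torch.Tensor, gray_image: torch.Tensor) -> torch.Tensor:
        grad = sobel_gradient(gray_image)          # edge evidence from the raw image
        fused = torch.cat([coarse_map, grad], 1)   # (B, 2, H, W)
        return torch.sigmoid(self.refine(fused))   # refined crack probability map


if __name__ == "__main__":
    gray = torch.rand(1, 1, 256, 256)              # stand-in grayscale road image
    coarse = torch.rand(1, 1, 256, 256)            # stand-in coarse CNN prediction
    refined = GradientRefinementHead()(coarse, gray)
    print(refined.shape)                           # torch.Size([1, 1, 256, 256])
```

The design choice illustrated here is simply that raw image gradients carry sharp edge cues that a downsampled-and-upsampled CNN prediction tends to blur, so concatenating the two before a few convolutions gives the refinement stage direct access to boundary evidence.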
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2021.3069135