A Bidirectional Self-Rectifying Network With Bayesian Modeling for Vision-Based Crack Detection


Bibliographic Details
Published in IEEE Transactions on Industrial Informatics, Vol. 19, No. 3, pp. 3017-3028
Main Authors Zhu, Qiuchen; Ha, Quang Phuc
Format Journal Article
Language English
Published Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2023

Summary: Robotic vision is increasingly applied to surface inspection of built infrastructure, for which robust semantic segmentation algorithms are essential. This article presents a deep learning approach using a bidirectional self-rectifying network with Bayesian modeling (BSNBM) to improve detection accuracy while handling the embedded uncertainty caused by false-positive labels and by nonlinearity in sequential convolutional blocks. For integration with residual encoders, a feature-preserving branch is designed in which the output of each dilated convolutional block is upsized or downsized, passed on, and concatenated with the following blocks recursively and bidirectionally. Further, to achieve robust feature representation at an acceptable level of credibility, the convolutional kernels are randomized via a Bayesian model and adjusted with each evidence update. As a result, the network becomes less sensitive to uncertainty and to the redundant nonlinearity that is inevitable in activation layers. Experimental results confirm the advantage of the proposed BSNBM over current crack detection approaches.
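The core idea of randomizing convolutional kernels via a Bayesian model can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the kernel is drawn from a Gaussian posterior using the reparameterization trick, and repeated stochastic forward passes yield both a mean response map and a pixel-wise uncertainty map. All names (`bayesian_conv2d`, `w_mu`, `w_logvar`) and the single-channel "valid" convolution are illustrative assumptions.

```python
import numpy as np

def bayesian_conv2d(x, w_mu, w_logvar, rng):
    """Single-channel 'valid' 2D convolution with a randomized kernel.

    Rather than using a fixed kernel, a kernel is sampled from a
    Gaussian posterior N(w_mu, exp(w_logvar)) via the reparameterization
    trick, so each forward pass draws a different plausible kernel.
    """
    eps = rng.standard_normal(w_mu.shape)
    w = w_mu + np.exp(0.5 * w_logvar) * eps  # sampled kernel
    kh, kw = w.shape
    H, W = x.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))          # toy 8x8 input patch
w_mu = rng.standard_normal((3, 3))       # posterior mean of the kernel
w_logvar = np.full((3, 3), -4.0)         # small variance: near-deterministic

# Monte Carlo estimate over T stochastic forward passes:
# the mean map is the prediction, the std map is the uncertainty.
samples = np.stack([bayesian_conv2d(x, w_mu, w_logvar, rng) for _ in range(20)])
mean_map, std_map = samples.mean(axis=0), samples.std(axis=0)
```

In a full Bayesian network, `w_mu` and `w_logvar` would be learned parameters updated as evidence accrues, and regions where `std_map` is large would flag low-credibility detections, which is the mechanism the abstract invokes for suppressing false-positive labels.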
ISSN:1551-3203
1941-0050
DOI:10.1109/TII.2022.3172995