RetinaNet With Difference Channel Attention and Adaptively Spatial Feature Fusion for Steel Surface Defect Detection

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 70, pp. 1-11
Main Authors: Cheng, Xun; Yu, Jianbo
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2020.3040485

Summary: Surface defect detection is an important process for guaranteeing the quality of industrial production. A defect detection task aims to identify both the specific category and the precise position of each defect in an image, and it is hard to achieve high accuracy on both at once, which makes the task challenging in practice. In this study, a new deep neural network (DNN), RetinaNet with difference channel attention and adaptively spatial feature fusion (DEA_RetinaNet), is proposed for steel surface defect detection. First, a differential evolution search-based anchor optimization is performed to improve the detection accuracy of DEA_RetinaNet. Second, a novel channel attention mechanism is embedded in DEA_RetinaNet to reduce information loss. Finally, the adaptive spatial feature fusion (ASFF) module is used for effective fusion of the shallow and deep features extracted by the convolutional kernels. Experimental results on a steel surface defect data set (NEU-DET) show that DEA_RetinaNet achieves 78.25% mAP, a 2.92% improvement over RetinaNet, and has better recognition performance than other well-known DNN-based detectors.
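
The abstract gives no implementation details, but the ASFF step it names is commonly realized as a learned, per-pixel weighted blend of feature maps from different pyramid levels. The PyTorch sketch below illustrates that general idea only; the module name, layer layout, and the choice of three input levels are assumptions for illustration, not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ASFFSketch(nn.Module):
    # Learn one weight map per input level with a 1x1 convolution, normalize the
    # weights with a softmax across levels, and sum the weighted feature maps.
    def __init__(self, channels, num_levels=3):
        super().__init__()
        self.weight_convs = nn.ModuleList(
            nn.Conv2d(channels, 1, kernel_size=1) for _ in range(num_levels)
        )

    def forward(self, feats):
        # feats: list of tensors already resized to a common (N, C, H, W) shape
        logits = torch.cat([conv(f) for conv, f in zip(self.weight_convs, feats)], dim=1)
        weights = F.softmax(logits, dim=1)  # per-pixel weights sum to 1 across levels
        return sum(weights[:, i:i + 1] * f for i, f in enumerate(feats))

# Toy usage with three pyramid levels resampled to one resolution.
p3, p4, p5 = (torch.randn(1, 256, 64, 64) for _ in range(3))
fused = ASFFSketch(channels=256)([p3, p4, p5])
print(fused.shape)  # torch.Size([1, 256, 64, 64])

In this kind of fusion the network itself decides, location by location, how much each shallow or deep level contributes, which is the behavior the abstract attributes to the ASFF module.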