Surface Defect Detection Method Based on Improved Semisupervised Multitask Generative Adversarial Network
Published in: Scientific Programming, Vol. 2022, pp. 1-17
Format: Journal Article
Language: English
Published: New York: Hindawi Limited, 19.01.2022
Summary: Detection methods based on deep learning networks have attracted widespread interest in industrial manufacturing. However, existing methods are largely constrained by the need for large amounts of well-labeled training data and struggle to detect multiple types of defects simultaneously in practical settings. Therefore, this article proposes a defect detection method based on an improved semisupervised multitask generative adversarial network (iSSMT-GAN) for generating better image features and improving classification accuracy. First, the training data are manually labeled according to defect type, and the generative adversarial network (GAN) is constructed from these reliable defect annotations. A classification decision surface for detecting multiple defect types is thus formed in the GAN's discriminative network in an integrated manner. Moreover, the semisupervised samples produced by the discriminative network provide feedback to the generative network, enhancing image features and avoiding gradient vanishing or overfitting. Finally, experimental results show that the proposed method generates higher-quality image features than the classic GAN. Furthermore, it increases the classification accuracy of the RegNet model, MobileNet v3 model, VGG-19 model, and AlexNet-based transfer learning by 3.13%, 2.30%, 2.48%, and 3.12%, respectively.
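The abstract describes a discriminator that simultaneously separates real from generated images and classifies multiple defect types. The article itself provides no code; below is a minimal NumPy sketch of the standard way such a semisupervised, multiclass discriminator objective is formulated (a K+1-class output, with the extra class reserved for generated samples). All function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def log_softmax(logits):
    # Numerically stable log-softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    return z - np.log(np.exp(z).sum(axis=-1, keepdims=True))

def ssgan_discriminator_loss(logits_labeled, labels, logits_unlabeled, logits_fake):
    """K+1-class semisupervised discriminator loss (illustrative sketch).

    Columns 0..K-1 of the logits are the real defect classes;
    column K is the extra 'generated/fake' class.
    """
    K = logits_labeled.shape[1] - 1
    # Supervised term: cross-entropy on labeled real samples over all K+1 classes.
    lsm = log_softmax(logits_labeled)
    sup = -lsm[np.arange(len(labels)), labels].mean()
    # Unsupervised term for real unlabeled samples: they should NOT
    # fall into the 'fake' column, i.e. maximize log(1 - p_fake).
    p_fake_u = np.exp(log_softmax(logits_unlabeled)[:, K])
    unsup_real = -np.log(1.0 - p_fake_u + 1e-12).mean()
    # Unsupervised term for generated samples: assign them to the 'fake' column.
    unsup_fake = -log_softmax(logits_fake)[:, K].mean()
    return sup + unsup_real + unsup_fake
```

Minimizing this loss trains the same network to act as both the real/fake critic and the multitype defect classifier, which is one concrete reading of the "integrated classification decision surface" the abstract refers to.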
ISSN: 1058-9244, 1875-919X
DOI: 10.1155/2022/4481495