Multi-Modal Dynamic Fusion for Defect Detection in Electronic Products: A Novel Approach Based on Energy and Deep Learning

Bibliographic Details
Published in: IEEE Access, Vol. 13, pp. 118565-118573
Main Authors: Liu, Yulin; Gao, Yang
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025

Summary: As electronic products continue to evolve in complexity, maintaining stringent quality standards during manufacturing presents mounting challenges. Conventional defect detection approaches, which typically depend on a single modality, often fall short in both efficiency and reliability. To address these shortcomings, this study introduces a dynamic multi-modal fusion framework that leverages data from sensors, visual imagery, and component attributes to enhance detection performance. Specifically, Transformer architectures are employed for sensor data analysis, Convolutional Neural Networks (CNNs) are applied to process image data, and Multi-Layer Perceptrons (MLPs) are used to represent part-level features. A distinguishing element of this approach is an energy-based late-stage fusion mechanism that adaptively modulates each modality's influence according to its uncertainty level. Empirical evaluations demonstrate that the proposed model achieves superior results across multiple performance metrics (accuracy, precision, recall, and F1 score) compared to conventional and unimodal systems. These findings underscore the model's potential in advancing practical defect detection and quality assurance in manufacturing environments.
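
The abstract's central technical idea, an energy-based late fusion that weights each modality by its confidence, can be sketched compactly. The Python snippet below is a hypothetical illustration rather than the authors' implementation: the free-energy score (negative log-sum-exp of the logits) and the softmax over negated energies are assumed formulations, and all function names are invented for this example.

    # Hypothetical sketch (not from the paper): energy-based late fusion of
    # per-modality classifier logits, assuming PyTorch. Lower free energy is
    # read as higher confidence, so uncertain modalities get smaller weights.
    import torch
    import torch.nn.functional as F

    def free_energy(logits, temperature=1.0):
        # E(x) = -T * log sum_c exp(logit_c / T); an assumed confidence proxy.
        return -temperature * torch.logsumexp(logits / temperature, dim=-1)

    def energy_weighted_fusion(modality_logits):
        # modality_logits: list of (batch, classes) tensors, one per modality.
        energies = torch.stack([free_energy(l) for l in modality_logits], dim=-1)  # (B, M)
        weights = F.softmax(-energies, dim=-1)  # low energy -> high weight
        probs = torch.stack([F.softmax(l, dim=-1) for l in modality_logits], dim=1)  # (B, M, C)
        return (weights.unsqueeze(-1) * probs).sum(dim=1)  # fused (B, C) probabilities

    # Toy usage with the abstract's three branches: a Transformer over sensor
    # sequences, a CNN over images, and an MLP over part-level attributes, each
    # reduced here to random logits over {defect, no defect}.
    sensor_logits = torch.randn(4, 2)
    image_logits = torch.randn(4, 2)
    part_logits = torch.randn(4, 2)
    fused = energy_weighted_fusion([sensor_logits, image_logits, part_logits])
    print(fused.sum(dim=-1))  # each row sums to 1

Because the negated energies pass through a softmax, this kind of fusion degrades gracefully: a modality whose logits are nearly flat (high energy, high uncertainty) contributes little to the fused prediction, which matches the abstract's description of adaptively modulating each modality's influence.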
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3584551