Fusion Domain-Adaptation CNN Driven by Images and Vibration Signals for Fault Diagnosis of Gearbox Cross-Working Conditions

Bibliographic Details
Published in: Entropy (Basel, Switzerland), Vol. 24, No. 1, p. 119
Main Authors: Mao, Gang; Zhang, Zhongzheng; Qiao, Bin; Li, Yongbo
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 13.01.2022

Summary: The vibration signals of gearboxes contain abundant fault information and can be used for condition monitoring. However, vibration signals are ineffective for some non-structural failures. To resolve this dilemma, infrared thermal images are combined with vibration signals via a fusion domain-adaptation convolutional neural network (FDACNN), which can diagnose both structural and non-structural failures under various working conditions. First, the measured raw signals are converted into a frequency spectrum and a squared envelope spectrum to characterize the health states of the gearbox. Second, the frequency and squared-envelope-spectrum sequences are arranged into a two-dimensional format and combined with infrared thermal images to form fusion data. Finally, an adversarial network is introduced to recognize the states of structural and non-structural faults in the unlabeled target domain. Experiments on a gearbox test rig, in which both vibration signals and infrared thermal images were measured, validate the method's effectiveness. The results suggest that the proposed FDACNN performs best in cross-domain fault diagnosis of gearboxes via multi-source heterogeneous data compared with four other methods.
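The summary sketches the method's front end: each raw vibration record is transformed into a frequency spectrum and a squared envelope spectrum, each spectrum is arranged as a two-dimensional array, and the result is fused with an infrared thermal image into one multi-channel sample. The Python sketch below illustrates one plausible way to build such a fusion sample; it is not the authors' implementation, and the sampling rate, the 32 x 32 image size, and the channel-stacking layout are all assumptions.

    import numpy as np
    from scipy.signal import hilbert

    def squared_envelope_spectrum(x, fs):
        """Squared envelope spectrum computed via the Hilbert transform."""
        analytic = hilbert(x)                  # analytic signal
        env_sq = np.abs(analytic) ** 2         # squared envelope
        env_sq -= env_sq.mean()                # suppress the DC component
        spec = np.abs(np.fft.rfft(env_sq))     # one-sided magnitude spectrum
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        return freqs, spec

    def to_square_image(seq, side=32):
        """Arrange the first side*side points of a 1-D sequence as a normalized image."""
        img = np.asarray(seq[: side * side], dtype=float).reshape(side, side)
        rng = img.max() - img.min()
        return (img - img.min()) / rng if rng > 0 else np.zeros_like(img)

    fs = 12_000                                     # assumed sampling rate (Hz)
    x = np.random.randn(4096)                       # placeholder raw vibration signal
    freq_spec = np.abs(np.fft.rfft(x - x.mean()))   # frequency-spectrum channel
    _, ses = squared_envelope_spectrum(x, fs)       # squared-envelope channel
    thermal = np.random.rand(32, 32)                # placeholder resized IR thermal image

    # Fusion sample: three channels stacked as one 2-D CNN input, shape (3, 32, 32)
    fusion = np.stack([to_square_image(freq_spec), to_square_image(ses), thermal])

For the adversarial stage, the summary only states that an adversarial network enables state recognition in the unlabeled target domain. A common way to realize such domain adaptation (assumed here, not confirmed by the record) is a DANN-style gradient reversal layer, so that a domain discriminator pushes the shared features toward domain invariance:

    import torch
    from torch.autograd import Function

    class GradReverse(Function):
        """Identity in the forward pass; negated, scaled gradient in the backward pass."""
        @staticmethod
        def forward(ctx, x, lam):
            ctx.lam = lam
            return x.view_as(x)

        @staticmethod
        def backward(ctx, grad_output):
            return -ctx.lam * grad_output, None

    # Hypothetical use around a shared feature extractor:
    #   class_logits  = classifier(features_src)         # supervised loss on labeled source data
    #   domain_logits = discriminator(GradReverse.apply(features, 1.0))
    #   loss = ce(class_logits, y_src) + ce(domain_logits, domain_labels)
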
ISSN: 1099-4300
DOI: 10.3390/e24010119