An EEG-Based Transfer Learning Method for Cross-Subject Fatigue Mental State Prediction

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 21, No. 7, p. 2369
Main Authors: Zeng, Hong; Li, Xiufeng; Borghini, Gianluca; Zhao, Yue; Aricò, Pietro; Di Flumeri, Gianluca; Sciaraffa, Nicolina; Zakaria, Wael; Kong, Wanzeng; Babiloni, Fabio
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 29.03.2021

Summary: Fatigued driving is one of the main causes of traffic accidents. Electroencephalogram (EEG)-based mental state analysis is an effective and objective way of detecting fatigue. However, because EEG differs significantly across subjects, effectively "transferring" an EEG analysis model trained on existing subjects to the EEG signals of other subjects remains a challenge. The Domain-Adversarial Neural Network (DANN) has shown excellent transfer-learning performance, especially in document analysis and image recognition, but it has not been applied directly to EEG-based cross-subject fatigue detection. In this paper, we present a DANN-based model, Generative-DANN (GDANN), which incorporates a Generative Adversarial Network (GAN) to better handle the differing EEG distributions across subjects. The comparative results show that, on cross-subject tasks, GDANN achieves a higher average accuracy in fatigue detection (91.63%) than traditional classification models, which is expected to give it much broader application prospects in practical brain-computer interaction (BCI).
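As background for the summary above, the following is a minimal, illustrative PyTorch sketch of the standard DANN building block that GDANN extends: a shared feature extractor, a task (fatigue) classifier, and a domain discriminator trained through a gradient-reversal layer. It is not the authors' GDANN implementation (which additionally uses a GAN to address distribution differences across subjects); the class names, layer sizes, and the 310-dimensional feature input are assumptions chosen only for illustration.

import torch
import torch.nn as nn

class GradientReversal(torch.autograd.Function):
    # Identity in the forward pass; multiplies gradients by -lambda in backward,
    # so the feature extractor is trained to fool the domain discriminator.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.clone()

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class DANN(nn.Module):
    # Shared feature extractor + label head (fatigued vs. alert)
    # + gradient-reversed domain head (source subject vs. target subject).
    def __init__(self, n_features=310, n_classes=2, n_domains=2, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        self.features = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.label_head = nn.Linear(64, n_classes)
        self.domain_head = nn.Linear(64, n_domains)

    def forward(self, x):
        z = self.features(x)
        label_logits = self.label_head(z)
        domain_logits = self.domain_head(GradientReversal.apply(z, self.lambd))
        return label_logits, domain_logits

# Example forward pass on a random batch of 8 hypothetical feature vectors.
model = DANN()
label_logits, domain_logits = model(torch.randn(8, 310))

Minimizing the label loss while the domain head is trained adversarially (via gradient reversal) pushes the extractor toward subject-invariant features; GDANN, as described above, adds a generative component on top of this adversarial setup.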
ISSN: 1424-8220
DOI: 10.3390/s21072369