Multi-Source Transfer Learning for EEG Classification Based on Domain Adversarial Neural Network

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 31, pp. 218-228
Main Authors: Liu, Dezheng; Zhang, Jia; Wu, Hanrui; Liu, Siwei; Long, Jinyi
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023

Summary: Electroencephalogram (EEG) classification has attracted great attention in recent years, and many models have been presented for this task. Nevertheless, EEG data vary from subject to subject, and these individual differences may degrade a classifier's performance. Collecting enough labeled data from each subject would address the issue, but it is often time-consuming and labor-intensive. In this paper, we propose a new multi-source transfer learning method based on a domain adversarial neural network for EEG classification. Specifically, we design a domain adversarial neural network comprising a feature extractor, a classifier, and a domain discriminator, which together reduce the domain shift between subjects. In addition, a unified multi-source optimization framework is constructed to further improve performance, and the final EEG classification result is obtained as a weighted combination of the predictions from multiple source domains. Experiments on three publicly available EEG datasets validate the advantages of the proposed method.
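The weighted combination of source-domain predictions mentioned in the summary can be sketched as follows. This is an illustrative sketch only: the probability values and the fixed per-source weights are assumptions for demonstration, not the weighting scheme derived in the paper.

```python
import numpy as np

# Hypothetical class probabilities from three source-domain classifiers,
# each predicting 4 target trials over 2 classes (rows sum to 1).
source_preds = [
    np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.3, 0.7]]),
    np.array([[0.7, 0.3], [0.4, 0.6], [0.5, 0.5], [0.2, 0.8]]),
    np.array([[0.8, 0.2], [0.1, 0.9], [0.7, 0.3], [0.4, 0.6]]),
]

# Assumed per-source weights (e.g. reflecting each source subject's
# similarity to the target subject), normalized to sum to 1.
weights = np.array([0.5, 0.2, 0.3])
weights = weights / weights.sum()

# Weighted combination: average the probability matrices by source weight,
# then take the argmax per trial as the final predicted class.
combined = sum(w * p for w, p in zip(weights, source_preds))
labels = combined.argmax(axis=1)
```

In a multi-source setting like the one described, weighting lets more transferable source subjects contribute more to the target prediction than dissimilar ones.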
ISSN: 1534-4320, 1558-0210
DOI:10.1109/TNSRE.2022.3219418