A spatial-frequency-temporal 3D convolutional neural network for motor imagery EEG signal classification

Bibliographic Details
Published in: Signal, Image and Video Processing, Vol. 15, No. 8, pp. 1797–1804
Main Authors: Miao, Minmin; Hu, Wenjun; Zhang, Wenbin
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.11.2021
ISSN: 1863-1703, 1863-1711
DOI: 10.1007/s11760-021-01924-3

Summary: Motor imagery (MI) EEG signal classification is a critical issue for brain–computer interface (BCI) systems. In traditional MI EEG machine learning algorithms, feature extraction and classification often have different objective functions, thus resulting in information loss. To solve this problem, a novel spatial-frequency-temporal (SFT) 3D CNN model is proposed. Specifically, the energies of EEG signals located in multiple local SFT ranges are extracted to obtain a novel 3D MI EEG feature representation, and a novel 3D CNN model is designed to simultaneously learn the complex MI EEG features across the entire SFT domain and carry out classification. An extensive experimental study is conducted on two public EEG datasets to evaluate the effectiveness of the method. For BCI Competition III Dataset IVa, the average accuracy rate of five subjects obtained by the proposed method reaches 86.6%, a 4.1% improvement over the state-of-the-art filter band common spatial pattern (FBCSP) method. For BCI Competition III Dataset IIIa, with an average accuracy rate of 91.85%, the proposed method outperforms the state-of-the-art dictionary pair learning (DPL) method by 4.44%.
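Illustration: the abstract outlines two steps, building a 3D spatial-frequency-temporal (SFT) energy representation per trial and classifying it with a 3D CNN, but the record contains no code. The sketch below is only a rough illustration of that general idea, not the authors' implementation; the band edges, window length, layer sizes, and names such as sft_energy_cube and SFT3DCNN are assumptions made for the example.

    # Minimal sketch (assumed parameters, not the authors' code): band energies over
    # local spatial-frequency-temporal ranges stacked into a 3D cube, then a small 3D CNN.
    import numpy as np
    import torch
    import torch.nn as nn
    from scipy.signal import butter, filtfilt

    def sft_energy_cube(eeg, fs=100, bands=((8, 12), (12, 16), (16, 20), (20, 24), (24, 28), (28, 32)), win_sec=0.5):
        """eeg: (channels, samples) single trial -> (bands, windows, channels) log-energy cube."""
        n_ch, n_smp = eeg.shape
        win = int(win_sec * fs)
        n_win = n_smp // win
        cube = np.zeros((len(bands), n_win, n_ch), dtype=np.float32)
        for bi, (lo, hi) in enumerate(bands):
            b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)   # band-pass per frequency range
            filt = filtfilt(b, a, eeg, axis=-1)
            for wi in range(n_win):
                seg = filt[:, wi * win:(wi + 1) * win]            # local temporal window
                cube[bi, wi] = np.log(np.mean(seg ** 2, axis=-1) + 1e-8)  # log band energy per channel
        return cube

    class SFT3DCNN(nn.Module):
        """Small 3D CNN over (frequency, time, channel) energy cubes; layer sizes are assumptions."""
        def __init__(self, n_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.BatchNorm3d(16), nn.ELU(),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.BatchNorm3d(32), nn.ELU(),
                nn.AdaptiveAvgPool3d((2, 2, 2)),
            )
            self.classifier = nn.Sequential(nn.Flatten(), nn.Linear(32 * 8, n_classes))

        def forward(self, x):  # x: (batch, 1, bands, windows, channels)
            return self.classifier(self.features(x))

    # Example with one synthetic 118-channel trial (3.5 s at 100 Hz, as in BCI III Dataset IVa).
    trial = np.random.randn(118, 350)
    cube = torch.from_numpy(sft_energy_cube(trial))[None, None]   # (1, 1, bands, windows, channels)
    logits = SFT3DCNN(n_classes=2)(cube)

Training such a network end to end lets the feature extractor and the classifier share one objective, which is the motivation stated in the abstract for replacing separately optimized feature-extraction and classification stages.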