Adaptive Spatiotemporal Graph Convolutional Networks for Motor Imagery Classification

Bibliographic Details
Published in: IEEE Signal Processing Letters, Vol. 28, pp. 219-223
Main Authors: Sun, Biao; Zhang, Han; Wu, Zexu; Zhang, Yunyan; Li, Ting
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2021.3049683

Summary: Classification of electroencephalogram-based motor imagery (MI-EEG) tasks is crucial in brain-computer interfaces (BCIs). In view of the non-stationarity, time-variability, and individual diversity of EEG signals, a novel framework based on graph neural networks is proposed for MI-EEG classification. First, an adaptive graph convolutional layer (AGCL) is constructed, by which electrode channel information is integrated dynamically. We further propose an adaptive spatiotemporal graph convolutional network (ASTGCN), which simultaneously exploits the characteristics of EEG signals in the time domain and the channel correlations in the spatial domain. We conduct experiments on EEG signals recorded in motor imagery scenarios, in which twenty-five healthy subjects performed MI movements of the right hand and feet to generate motor commands. Experimental results reveal that the proposed method outperforms state-of-the-art methods in both classification quality and robustness. The advantages of ASTGCN include high accuracy, high efficiency, and robustness to cross-trial and cross-subject variations, making it an ideal candidate for long-term MI-EEG applications.
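The summary describes the AGCL only at a high level: a graph convolution over EEG electrode channels whose adjacency is learned from data rather than fixed by electrode geometry. As an illustration of that general idea only (not the paper's actual implementation), here is a minimal NumPy sketch; the channel count, trial length, feature width, initializations, and the `adaptive_graph_conv` helper are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical dimensions: C EEG channels, T time samples, F output features.
C, T, F = 22, 128, 16

# Trainable parameters (randomly initialized here for the sketch): dense
# logits scoring every channel pair, and a temporal feature projection.
adj_logits = rng.normal(size=(C, C))   # learned channel-connectivity scores
W = rng.normal(size=(T, F)) * 0.01     # feature projection weights

def adaptive_graph_conv(X):
    """One adaptive graph convolution step on a (C, T) EEG trial.

    The adjacency is derived from trainable logits, so channel
    correlations are learned from data instead of being fixed.
    """
    A = softmax(adj_logits, axis=-1)   # row-normalized adaptive adjacency
    return np.tanh(A @ X @ W)          # mix channels, project, activate

X = rng.normal(size=(C, T))            # one synthetic EEG trial
H = adaptive_graph_conv(X)
print(H.shape)                         # (22, 16): channels x features
```

In a trained model, `adj_logits` and `W` would be updated by backpropagation, which is what lets the layer adapt the channel graph to each dataset.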