Compact convolutional transformer for subject-independent motor imagery EEG-based BCIs

Bibliographic Details
Published in: Scientific Reports, Vol. 14, No. 1, Article 25775 (11 pages)
Main Authors: Keutayeva, Aigerim; Fakhrutdinov, Nail; Abibullaev, Berdakh
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK, 28.10.2024

Summary: Motor imagery electroencephalography (EEG) analysis is crucial for the development of effective brain-computer interfaces (BCIs), yet it presents considerable challenges due to the complexity of the data and inter-subject variability. This paper introduces EEGCCT, an application of compact convolutional transformers designed specifically to improve the analysis of motor imagery tasks in EEG. Unlike traditional approaches, the EEGCCT model significantly enhances generalization from limited data, effectively addressing a common limitation in EEG datasets. We validate and test our models on the open-source BCI Competition IV datasets 2a and 2b, employing a Leave-One-Subject-Out (LOSO) strategy to ensure subject-independent performance. Our findings demonstrate that EEGCCT not only outperforms conventional models such as EEGNet in standard evaluations but also achieves better performance than other advanced models such as Conformer, Hybrid s-CViT, and Hybrid t-CViT, while using fewer parameters and achieving an accuracy of 70.12%. Additionally, the paper presents a comprehensive ablation study that includes targeted data augmentation, hyperparameter optimization, and architectural improvements.
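
To illustrate the subject-independent evaluation described in the summary, the sketch below shows a generic Leave-One-Subject-Out (LOSO) loop. It is a minimal example under stated assumptions: the random features, binary labels, subject grouping, and the LogisticRegression stand-in for EEGCCT are all illustrative placeholders, not the paper's actual data pipeline or model.

```python
# Minimal LOSO sketch: train on all subjects but one, test on the held-out
# subject, and average accuracy across folds. All data below is synthetic.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_subjects = 9                                  # BCI Competition IV 2a has 9 subjects
n_trials, n_features = 90, 64                   # placeholder trial and feature counts
X = rng.normal(size=(n_trials, n_features))     # placeholder EEG features
y = rng.integers(0, 2, size=n_trials)           # placeholder binary motor-imagery labels
subjects = np.repeat(np.arange(n_subjects), n_trials // n_subjects)

logo = LeaveOneGroupOut()
accuracies = []
for train_idx, test_idx in logo.split(X, y, groups=subjects):
    # The held-out group (subject) never appears in training,
    # which is what makes the evaluation subject-independent.
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

print(f"Mean LOSO accuracy over {n_subjects} folds: {np.mean(accuracies):.3f}")
```

In practice the classifier would be replaced by the trained EEG model (here, EEGCCT), with the same grouping logic ensuring that no data from the test subject leaks into training.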
ISSN: 2045-2322
DOI: 10.1038/s41598-024-73755-4