Decoding Subject-Driven Cognitive States from EEG Signals for Cognitive Brain–Computer Interface


Bibliographic Details
Published in: Brain Sciences, Vol. 14, No. 5, p. 498
Main Authors: Huang, Dingyong; Wang, Yingjie; Fan, Liangwei; Yu, Yang; Zhao, Ziyu; Zeng, Pu; Wang, Kunqing; Li, Na; Shen, Hui
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 01.05.2024

Summary: In this study, we investigated the feasibility of using electroencephalogram (EEG) signals to differentiate between four distinct subject-driven cognitive states: resting state, narrative memory, music, and subtraction tasks. EEG data were collected from seven healthy male participants while they performed these cognitive tasks, and the raw EEG signals were transformed into time–frequency maps using the continuous wavelet transform. Based on these time–frequency maps, we developed a convolutional neural network with a channel and frequency attention mechanism (TF-CNN-CFA) to automatically distinguish between these cognitive states. The experimental results demonstrated that the model achieved an average classification accuracy of 76.14% in identifying the four cognitive states, significantly outperforming traditional EEG signal-processing methods and other classical image classification algorithms. Furthermore, we investigated the impact of varying EEG signal lengths on classification performance and found that TF-CNN-CFA performs consistently across different window lengths, indicating strong generalization capability. This study validates the ability of EEG to differentiate higher cognitive states, which could potentially offer a novel brain–computer interface (BCI) paradigm.
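The first stage of the pipeline described in the summary, converting raw EEG into a time–frequency map with a continuous wavelet transform, can be sketched in plain NumPy. This is a minimal illustrative sketch, not the paper's implementation: the Morlet parameter `w`, the sampling rate, the frequency grid, and the synthetic one-channel "EEG" signal below are all assumptions for demonstration.

```python
import numpy as np

def morlet_cwt(signal, fs, freqs, w=6.0):
    """Magnitude time-frequency map via convolution with complex Morlet
    wavelets, one row per analysis frequency.

    Returns an array of shape (len(freqs), len(signal)) -- the kind of
    2-D map that could be fed to an image-style CNN classifier.
    """
    out = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma = w / (2.0 * np.pi * f)          # wavelet width in seconds
        t = np.arange(-4 * sigma, 4 * sigma, 1.0 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit energy
        out[i] = np.abs(np.convolve(signal, wavelet, mode="same"))
    return out

# Synthetic single-channel "EEG": 2 s at 250 Hz with a 10 Hz alpha rhythm
fs = 250
rng = np.random.default_rng(0)
t = np.arange(0, 2, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

freqs = np.arange(4.0, 40.0, 2.0)              # 4-38 Hz in 2 Hz steps
tf = morlet_cwt(x, fs, freqs)
print(tf.shape)                                # (18, 500)
print(freqs[np.argmax(tf.mean(axis=1))])       # 10.0 -- the alpha band
```

The resulting map localizes the embedded 10 Hz rhythm in its frequency row, which is what makes such maps usable as CNN input; the paper's actual model then applies channel and frequency attention on top of multi-channel maps like this one.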
ISSN: 2076-3425
DOI: 10.3390/brainsci14050498