Spatial-temporal network for fine-grained-level emotion EEG recognition

Bibliographic Details
Published in: Journal of Neural Engineering, Vol. 19, No. 3, pp. 36017-36028
Main Authors: Ji, Youshuo; Li, Fu; Fu, Boxun; Li, Yang; Zhou, Yijin; Niu, Yi; Zhang, Lijian; Chen, Yuanfang; Shi, Guangming
Format: Journal Article
Language: English
Published: England: IOP Publishing, 01.06.2022

Summary: Electroencephalogram (EEG)-based affective computing brain–computer interfaces provide the capability for machines to understand human intentions. In practice, people are more concerned with the strength of a certain emotional state over a short period of time, which is termed fine-grained-level emotion in this paper. In this study, we built a fine-grained-level emotion EEG dataset that contains two coarse-grained emotions and four corresponding fine-grained-level emotions. To fully extract the features of the EEG signals, we propose a fine-grained emotion EEG network (FG-emotionNet) for spatial-temporal feature extraction. Each feature extraction layer is linked to the raw EEG signals to alleviate overfitting and to ensure that the spatial features of each scale can be extracted from the raw signals. Moreover, all features from previous scales are fused before the current spatial-feature layer to enhance the scale features in the spatial block. Additionally, long short-term memory is adopted as the temporal block to extract temporal features from the spatial features and classify the fine-grained emotion categories. Subject-dependent and cross-session experiments demonstrate that the proposed method outperforms both representative emotion-recognition methods and methods with structures similar to ours.
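The summary describes the data flow of FG-emotionNet concretely enough to sketch: each spatial scale layer receives the raw EEG concatenated with all earlier scale features, the fused spatial features feed an LSTM temporal block, and the last hidden state is classified into four fine-grained emotion categories. The following is a minimal PyTorch sketch of that idea, not the authors' implementation; the channel count, number of scales, kernel sizes, hidden sizes, and the class name FGEmotionNetSketch are illustrative assumptions.

import torch
import torch.nn as nn


class FGEmotionNetSketch(nn.Module):
    """Illustrative sketch (not the published code) of the described design:
    every spatial scale layer also sees the raw EEG, earlier scale features
    are fused into the input of the current layer, and an LSTM temporal
    block classifies the four fine-grained emotion categories."""

    def __init__(self, n_channels=62, n_scales=3, feat_dim=32, n_classes=4):
        super().__init__()
        # One conv layer per spatial scale; its input is the raw channels
        # concatenated with all previously extracted scale features.
        # Kernel sizes 3/5/7 with matching padding keep the time length fixed.
        self.scale_layers = nn.ModuleList([
            nn.Conv1d(n_channels + i * feat_dim, feat_dim,
                      kernel_size=2 * i + 3, padding=i + 1)
            for i in range(n_scales)
        ])
        self.lstm = nn.LSTM(n_scales * feat_dim, 64, batch_first=True)
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                        # x: (batch, channels, time)
        feats = []
        for layer in self.scale_layers:
            inp = torch.cat([x] + feats, dim=1)  # fuse raw EEG + earlier scales
            feats.append(torch.relu(layer(inp)))
        spatial = torch.cat(feats, dim=1)        # (batch, n_scales*feat_dim, time)
        out, _ = self.lstm(spatial.transpose(1, 2))
        return self.classifier(out[:, -1])       # logits from the last time step


model = FGEmotionNetSketch()
logits = model(torch.randn(8, 62, 200))          # 8 trials, 62 channels, 200 samples

Concatenating the raw signal into every scale layer is how the abstract's "each feature extraction layer is linked to raw EEG signals" could be realized as dense skip connections; the actual layer types and fusion operator in the paper may differ.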
Bibliography: JNE-105304.R1
ISSN: 1741-2560
EISSN: 1741-2552
DOI: 10.1088/1741-2552/ac6d7d