EEG-based Emotion Recognition with Feature Fusion Networks


Bibliographic Details
Published in: International Journal of Machine Learning and Cybernetics, Vol. 13, No. 2, pp. 421–429
Main Authors: Gao, Qiang; Yang, Yi; Kang, Qiaoju; Tian, Zekun; Song, Yu
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.02.2022 (Springer Nature B.V.)

Summary: With the rapid development of human-computer interaction, automatic emotion recognition based on multichannel electroencephalography (EEG) signals has attracted much attention in recent years. However, many existing studies on EEG-based emotion recognition ignore the correlations between EEG channels and cannot fully capture the contextual information in EEG signals. This paper proposes a novel multi-feature fusion network consisting of spatial and temporal neural network structures that learn discriminative spatio-temporal emotional information for emotion recognition. Two common types of features are extracted: time-domain features (Hjorth parameters, differential entropy, sample entropy) and a frequency-domain feature (power spectral density). A convolutional neural network inspired by GoogLeNet's inception structure is then adopted to capture the intrinsic spatial relationships among EEG electrodes and the contextual information. Fully connected layers fuse the features, and an SVM, rather than a SoftMax function, classifies the resulting high-level emotion features. To evaluate the proposed method, leave-one-subject-out emotion recognition experiments are conducted on the DEAP dataset; the results show that the method performs well, achieving average accuracies of 80.52% and 75.22% on the valence and arousal classification tasks, respectively.
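The features named in the abstract are standard and easy to compute per channel. The sketch below shows three of them: Hjorth parameters, differential entropy (under the usual Gaussian assumption), and band-limited power spectral density via Welch's method; sample entropy is omitted for brevity. This is a minimal illustration, not the authors' code, and all function names and the 128 Hz sampling rate (DEAP's preprocessed rate) are assumptions for the example.

```python
import numpy as np
from scipy.signal import welch

def hjorth_parameters(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / np.var(x))
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def differential_entropy(x):
    """DE of a Gaussian-distributed signal: 0.5 * ln(2 * pi * e * var)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def band_psd(x, fs, band):
    """Mean power spectral density within a frequency band (Welch's method)."""
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), 256))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Example on a synthetic 1-second "EEG" segment: a 10 Hz tone plus noise.
rng = np.random.default_rng(0)
fs = 128
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

act, mob, comp = hjorth_parameters(x)
de = differential_entropy(x)
alpha = band_psd(x, fs, (8, 13))  # alpha band, where the 10 Hz tone lies
```

In the paper's pipeline, such per-channel features (computed over short windows) would form the input that the spatial and temporal networks then process before fusion and SVM classification.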
ISSN: 1868-8071, 1868-808X
DOI: 10.1007/s13042-021-01414-5