Core-Brain-Network-Based Multilayer Convolutional Neural Network for Emotion Recognition


Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 70, pp. 1-9
Main Authors: Gao, Zhongke; Li, Rumei; Ma, Chao; Rui, Linge; Sun, Xinlin
Format: Journal Article
Language: English
Published: New York: IEEE, 2021
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)

Summary: In this article, we propose a method for emotion classification based on a multilayer convolutional neural network (MCNN) that combines differential entropy (DE) and brain networks. First, we use the continuous wavelet transform (CWT) to obtain the time-frequency representation (TFR) of the electroencephalogram (EEG) signal on each channel and extract rich information from different frequency bands for subsequent analysis. Brain networks are then constructed in multiple bands to characterize the spatial connections hidden in the multichannel EEG signals. Based on these brain networks, we further derive core brain networks through a set of key nodes determined by DE. These core brain networks are associated with brain activities and differ markedly between emotional states. The final MCNN model takes the DE features and core brain networks as inputs for emotion recognition. We evaluate our method on the SJTU emotion EEG dataset, achieving an average accuracy of 91.45% across 15 subjects. By exploiting the complementary features of DE and brain networks, the proposed method provides an efficient framework for accurate emotion recognition from EEG signals.
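The DE-based node selection described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the standard Gaussian closed form for DE used in EEG emotion studies, a 62-channel montage (as in the SJTU/SEED dataset), and absolute Pearson correlation as a placeholder connectivity measure, since the abstract does not name the paper's actual metric, threshold, or number of core nodes.

```python
import numpy as np

def differential_entropy(x):
    # Under a Gaussian assumption (standard in EEG emotion work),
    # DE reduces to 0.5 * ln(2 * pi * e * variance).
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def brain_network(eeg, threshold=0.5):
    # eeg: array of shape (channels, samples). Adjacency from the
    # absolute Pearson correlation between channel pairs; the metric
    # and threshold are placeholders, not taken from the article.
    corr = np.abs(np.corrcoef(eeg))
    adj = (corr >= threshold).astype(float)
    np.fill_diagonal(adj, 0.0)  # no self-loops
    return adj

# Hypothetical band-filtered segment: 62 channels, 2 s at 200 Hz.
rng = np.random.default_rng(0)
eeg = rng.normal(size=(62, 400))

# Score every channel by DE and keep the 10 highest-DE key nodes.
de_per_channel = np.array([differential_entropy(ch) for ch in eeg])
core = np.argsort(de_per_channel)[-10:]

# Restrict the full network to the key nodes: the "core brain network".
core_net = brain_network(eeg)[np.ix_(core, core)]
```

In the paper this per-band pipeline would be repeated for each frequency band of the TFR, and the resulting DE features and core networks fed jointly into the MCNN classifier.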
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2021.3090164