EEG-based Emotion Recognition Using Graph Convolutional Network with Learnable Electrode Relations

Bibliographic Details
Published in: 2021 43rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Vol. 2021, pp. 5953-5957
Main Authors: Jin, Ming; Chen, Hao; Li, Zhunan; Li, Jinpeng
Format: Conference Proceeding; Journal Article
Language: English
Published: United States: IEEE, 01.11.2021

Summary: Emotion recognition based on electroencephalography (EEG) plays a pivotal role in the field of affective computing, and the graph convolutional network (GCN) has proved to be an effective method that has made considerable progress. Since the adjacency matrix describing the electrode relationships is critical in a GCN, it is necessary to explore effective electrode relationships for the GCN. However, in emotion recognition the adjacency matrix and its values are usually set empirically and subjectively, and whether this setting matches the target task remains an open question. To address this problem, we propose a graph convolutional network with learnable electrode relations (LR-GCN), which learns the adjacency matrix automatically in a goal-driven manner: self-attention updates the Laplacian matrix in the forward pass, and gradient propagation updates the adjacency matrix in the backward pass. Compared with previous works that use simple electrode relationships or only feature information, LR-GCN achieves higher emotion recognition accuracy by extracting more reasonable electrode relationships during the training process. We conducted a subject-dependent experiment on the SEED database and achieved recognition accuracies of 94.72% with the differential entropy (DE) feature and 85.24% with the power spectral density (PSD) feature. Visualizing the optimized Laplacian matrix shows that brain connections related to vision, hearing, and emotion are enhanced.
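The abstract gives no implementation details, but the idea of a GCN layer whose adjacency matrix is learned in a goal-driven manner can be illustrated with a minimal PyTorch sketch. Everything below is an assumption made for illustration rather than the authors' code: the class and parameter names, the way a static trainable adjacency is blended with a batch-level self-attention term, and the 62-electrode, 5-band input shape used in the usage example.

```python
# Hypothetical sketch of a graph convolution over EEG electrodes with a
# learnable adjacency matrix; names and update rule are assumptions, not
# taken from the LR-GCN paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableAdjacencyGCN(nn.Module):
    def __init__(self, num_electrodes: int, in_features: int, out_features: int):
        super().__init__()
        # Adjacency matrix over electrodes, refined by backpropagation
        # (the "backward update" mentioned in the abstract).
        self.adjacency = nn.Parameter(torch.eye(num_electrodes))
        # Projections for a simple self-attention term that adjusts the graph
        # from the current batch (a stand-in for the "forward update").
        self.query = nn.Linear(in_features, in_features)
        self.key = nn.Linear(in_features, in_features)
        # Standard graph-convolution feature transform.
        self.transform = nn.Linear(in_features, out_features)

    @staticmethod
    def normalize(adj: torch.Tensor) -> torch.Tensor:
        # Symmetric normalization D^{-1/2} A D^{-1/2} with non-negative weights.
        adj = F.relu(adj)
        deg = adj.sum(dim=-1).clamp(min=1e-6)
        d_inv_sqrt = deg.pow(-0.5)
        return d_inv_sqrt.unsqueeze(-1) * adj * d_inv_sqrt.unsqueeze(0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_electrodes, in_features), e.g. DE or PSD features.
        scores = self.query(x) @ self.key(x).transpose(1, 2) / x.size(-1) ** 0.5
        attn = torch.softmax(scores, dim=-1)          # (batch, N, N)
        # Blend the learned static graph with the batch-level attention graph.
        adj = self.normalize(self.adjacency) + attn.mean(dim=0)
        return F.relu(adj @ self.transform(x))        # (batch, N, out_features)


# Usage sketch: 62 SEED electrodes, 5 frequency-band DE features per electrode.
layer = LearnableAdjacencyGCN(num_electrodes=62, in_features=5, out_features=32)
out = layer(torch.randn(8, 62, 5))                    # -> (8, 62, 32)
```

In this sketch the adjacency parameter receives gradients through the task loss (the backward update), while the attention term lets the effective graph adapt to the current input in the forward pass; the paper's actual formulation may differ.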
ISSN: 2694-0604
DOI: 10.1109/EMBC46164.2021.9630195