BiTCAN: An emotion recognition network based on saliency in brain cognition

Bibliographic Details
Published in: Mathematical Biosciences and Engineering (MBE), Vol. 20, No. 12, pp. 21537-21562
Main Authors: An, Yanling; Hu, Shaohai; Liu, Shuaiqi; Li, Bing
Format: Journal Article
Language: English
Published: AIMS Press, United States, 05.12.2023

More Information
Summary: In recent years, with the continuous development of artificial intelligence and brain-computer interfaces, emotion recognition based on electroencephalogram (EEG) signals has become a flourishing research direction. Motivated by saliency in brain cognition, we construct a new spatio-temporal convolutional attention network for emotion recognition named BiTCAN. First, in the proposed method, the original EEG signals are de-baselined, and a two-dimensional mapping matrix sequence of the EEG signals is constructed from the electrode positions. Second, on the basis of this two-dimensional mapping matrix sequence, saliency features of brain cognition are extracted using a bi-hemisphere discrepancy module, and the spatio-temporal features of the EEG signals are captured using a 3-D convolution module. Finally, the saliency features and spatio-temporal features are fused in an attention module to further capture the internal spatial relationships between brain regions, and the fused representation is input into the classifier for emotion recognition. Extensive experiments on DEAP and SEED (two public datasets) show that the proposed algorithm achieves accuracies above 97% on both, which is superior to most existing emotion recognition algorithms.
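The preprocessing step described in the summary (de-baselining the raw EEG, then scattering each time sample's channel values onto a 2-D grid given by the electrode layout) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the electrode-to-grid coordinates, grid size, and sampling rates below are hypothetical placeholders, since the record does not specify them.

```python
import numpy as np

# Hypothetical (row, col) grid positions for a few 10-20 system electrodes;
# the paper's exact electrode-to-grid mapping is not given in this record.
ELECTRODE_GRID = {
    "Fp1": (0, 3), "Fp2": (0, 5),
    "F3":  (2, 2), "F4":  (2, 6),
    "C3":  (4, 2), "Cz":  (4, 4), "C4": (4, 6),
    "P3":  (6, 2), "P4":  (6, 6),
    "O1":  (8, 3), "O2":  (8, 5),
}

def debaseline(trial, baseline):
    """Subtract each channel's mean over the baseline segment from the trial."""
    return trial - baseline.mean(axis=1, keepdims=True)

def to_2d_maps(trial, channel_names, grid=ELECTRODE_GRID, size=(9, 9)):
    """Scatter each time sample's channel values onto a 2-D electrode grid,
    producing a (time, height, width) mapping-matrix sequence; grid cells
    with no electrode stay zero."""
    t = trial.shape[1]
    maps = np.zeros((t, *size), dtype=trial.dtype)
    for ch, name in enumerate(channel_names):
        r, c = grid[name]
        maps[:, r, c] = trial[ch]
    return maps

# Toy data: 11 channels, 1 s baseline and 4 s trial at an assumed 128 Hz.
rng = np.random.default_rng(0)
names = list(ELECTRODE_GRID)
baseline = rng.normal(size=(len(names), 128))
trial = rng.normal(size=(len(names), 512))
maps = to_2d_maps(debaseline(trial, baseline), names)
print(maps.shape)  # (512, 9, 9)
```

The resulting (time, height, width) sequence is the kind of input a 3-D convolution module can consume directly, with the left/right halves of the grid available to a bi-hemisphere discrepancy computation.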
ISSN: 1551-0018
DOI: 10.3934/mbe.2023953