STILN: A Novel Spatial-Temporal Information Learning Network for EEG-based Emotion Recognition
Main Authors | , , ,
---|---
Format | Journal Article
Language | English
Published | 22.11.2022
DOI | 10.48550/arxiv.2211.12103
Summary: | Spatial correlations and temporal contexts are both indispensable in Electroencephalogram (EEG)-based emotion recognition. However, learning the complex spatial correlations among multiple channels is challenging. Moreover, learning temporal contexts helps emphasize the critical EEG frames, because subjects reach the intended emotion only during part of the stimuli. Hence, we propose a novel Spatial-Temporal Information Learning Network (STILN) that extracts discriminative features by capturing both spatial correlations and temporal contexts. Specifically, generated 2D power topographic maps capture the dependencies among electrodes and are fed to a CNN-based spatial feature extraction network. A Convolutional Block Attention Module (CBAM) then recalibrates the weights of the power topographic maps to emphasize the crucial brain regions and frequency bands. Meanwhile, Batch Normalization (BN) and Instance Normalization (IN) are combined appropriately to mitigate individual differences. For temporal context learning, we adopt a Bidirectional Long Short-Term Memory (Bi-LSTM) network to capture the dependencies among EEG frames. To validate the effectiveness of the proposed method, subject-independent experiments are conducted on the public DEAP dataset. The proposed method achieves outstanding performance: the arousal and valence classification accuracies reach 0.6831 and 0.6752, respectively. |
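The spatial branch described in the summary starts from 2D power topographic maps: per-electrode band power projected onto an image-like grid that a CNN can consume. As a rough illustration of that preprocessing step only, the sketch below computes band power with a plain FFT periodogram and scatters it onto a grid. The electrode coordinates, grid size, and the choice of the alpha band are illustrative assumptions, not details taken from the paper (DEAP itself provides 32 channels downsampled to 128 Hz, and published pipelines often interpolate between electrodes rather than leaving empty cells zero).

```python
import numpy as np

# Hypothetical (row, col) grid positions for a subset of 10-20 system
# electrodes on a 9x9 map -- an illustrative layout, not the paper's.
ELECTRODE_POS = {
    "Fp1": (0, 3), "Fp2": (0, 5),
    "F3": (2, 2), "Fz": (2, 4), "F4": (2, 6),
    "C3": (4, 2), "Cz": (4, 4), "C4": (4, 6),
    "P3": (6, 2), "Pz": (6, 4), "P4": (6, 6),
    "O1": (8, 3), "O2": (8, 5),
}

def band_power(signal, fs, low, high):
    """Mean power of `signal` in the [low, high] Hz band via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def topographic_map(eeg, fs=128, band=(8.0, 13.0), size=9):
    """Build one 2D power topographic map (one frequency band) from an
    {electrode: 1-D signal} dict; unoccupied grid cells stay zero."""
    grid = np.zeros((size, size))
    for name, sig in eeg.items():
        r, c = ELECTRODE_POS[name]
        grid[r, c] = band_power(sig, fs, *band)
    return grid

# Usage: one second of synthetic data per electrode at DEAP's 128 Hz rate.
rng = np.random.default_rng(0)
eeg = {name: rng.standard_normal(128) for name in ELECTRODE_POS}
topo = topographic_map(eeg)
print(topo.shape)  # (9, 9)
```

Stacking one such map per frequency band and per EEG frame yields the map sequence that, in the paper's design, passes through the CNN/CBAM spatial branch and then the Bi-LSTM temporal branch.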