MetaEmotionNet: Spatial-Spectral-Temporal based Attention 3D Dense Network with Meta-learning for EEG Emotion Recognition
| Published in | IEEE Transactions on Instrumentation and Measurement, p. 1 |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | IEEE, 03.12.2023 |
Summary: Emotion recognition has become an important area in affective computing, and emotion recognition based on multi-channel electroencephalogram (EEG) signals has grown popular in recent years. However, on the one hand, making full use of different EEG features, and of the discriminative local patterns among those features for various emotions, is challenging: existing methods ignore the complementarity among spatial-spectral-temporal features and the discriminative local patterns within them, which limits classification performance. On the other hand, for cross-subject emotion recognition, existing transfer learning methods require large amounts of training data, while collecting labeled EEG data is extremely expensive and time-consuming, which hinders the wide application of emotion recognition models to new subjects. To address these challenges, we propose MetaEmotionNet, a novel spatial-spectral-temporal based attention 3D dense network with meta-learning for emotion recognition. Specifically, MetaEmotionNet integrates spatial-spectral-temporal features simultaneously in a unified network framework through two-stream fusion, while a 3D attention mechanism adaptively explores discriminative local patterns. In addition, a meta-learning algorithm is applied to reduce dependence on training data. Experiments demonstrate that MetaEmotionNet outperforms baseline models on two benchmark datasets.
ISSN: 0018-9456
DOI: 10.1109/TIM.2023.3338676
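The abstract states that a meta-learning algorithm is applied to reduce dependence on training data for new subjects, but the record does not specify which algorithm. The sketch below illustrates the general idea with a first-order MAML-style inner/outer loop on a toy linear model; the model, losses, and hyperparameters here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def mse_grad(w, X, y):
    # Gradient of the mean-squared error (1/n)*||Xw - y||^2 w.r.t. w.
    n = X.shape[0]
    return 2.0 * X.T @ (X @ w - y) / n

def maml_step(w, tasks, inner_lr=0.05, outer_lr=0.05):
    """One meta-update over a batch of tasks (first-order MAML sketch).

    Each task is (X_support, y_support, X_query, y_query): adapt the
    shared weights on the support set with one gradient step, then
    accumulate the query-set gradient at the adapted weights. Second
    derivatives are ignored (first-order approximation).
    """
    meta_grad = np.zeros_like(w)
    for Xs, ys, Xq, yq in tasks:
        w_adapted = w - inner_lr * mse_grad(w, Xs, ys)  # inner loop
        meta_grad += mse_grad(w_adapted, Xq, yq)        # outer gradient
    return w - outer_lr * meta_grad / len(tasks)
```

In the paper's setting, each "task" would correspond to one subject's labeled EEG trials, so the meta-learned initialization can adapt to a new subject from only a few labeled examples.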