A Pre-Activation Residual Convolutional Network With Attention Modules for High-Resolution Segmented EEG Emotion Recognition
Published in: IEEE Access, Vol. 13, pp. 16303-16313
Main Authors: , ,
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025
Subjects:
Summary: Emotion recognition based on electroencephalography (EEG) signals has attracted considerable research interest in recent years, and several potential applications have been proposed, such as enhancing human-computer interaction, improving mental health diagnosis, and refining the customer experience in marketing. This paper introduces a novel model, the Pre-Activation Residual Convolutional Network with Attention Modules (PRCN-AM), designed to enhance the accuracy and robustness of EEG-based emotion recognition. PRCN-AM combines pre-activation residual convolutional layers with attention modules to capture and analyze the complex spatial-temporal patterns inherent in EEG signals. Two experimental procedures, subject-dependent and subject-independent, were conducted, and different time segmentations were tested in the preprocessing stage. The proposed exploitation of the temporal dynamics of EEG signals proves useful: classification accuracies of up to 99.51% and 97.51% were achieved on the SEED and SEED-IV datasets, respectively, outperforming current state-of-the-art models.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3530567
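For readers who want a concrete picture of the architecture named in the summary, the sketch below illustrates its two main ingredients: a pre-activation residual convolutional block and a channel-attention module applied to segmented EEG windows. This is a minimal PyTorch illustration, not the authors' implementation; the layer widths, kernel size, attention reduction ratio, and the 62-channel, 200 Hz, one-second segment shape are assumptions made for the example.

```python
# Minimal sketch (not the paper's released code): a pre-activation residual
# convolutional block followed by a squeeze-and-excitation style attention
# gate, operating on segmented EEG windows shaped (batch, channels, time).
# All sizes below are illustrative assumptions, not values from the article.
import torch
import torch.nn as nn


class PreActResidualBlock(nn.Module):
    """Pre-activation residual block: BN -> ReLU -> Conv, applied twice."""

    def __init__(self, channels: int, kernel_size: int = 7):
        super().__init__()
        padding = kernel_size // 2
        self.body = nn.Sequential(
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size, padding=padding),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size, padding=padding),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.body(x)  # identity skip connection


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style attention over EEG feature channels."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),                       # squeeze the time axis
            nn.Conv1d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.gate(x)                            # reweight each channel


if __name__ == "__main__":
    # Example: a batch of 8 one-second EEG segments, 62 electrodes at 200 Hz.
    x = torch.randn(8, 62, 200)
    block = nn.Sequential(PreActResidualBlock(62), ChannelAttention(62))
    print(block(x).shape)  # torch.Size([8, 62, 200])
```

In a full model, several such blocks would be stacked and followed by a classifier head; the actual depth, attention placement, and segment lengths used by PRCN-AM are described in the article itself, not in this record.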