Identifying relevant EEG channels for subject-independent emotion recognition using attention network layers

Bibliographic Details
Published in: Frontiers in Psychiatry, Vol. 16, p. 1494369
Main Authors: Valderrama, Camilo E.; Sheoran, Anshul
Format: Journal Article
Language: English
Published: Switzerland: Frontiers Media S.A., 10.02.2025

Summary: Electrical activity recorded with electroencephalography (EEG) enables the development of predictive models for emotion recognition. These models can be built using two approaches: subject-dependent and subject-independent. Although subject-independent models offer greater practical utility than subject-dependent models, they face challenges due to the significant variability of EEG signals between individuals. One potential solution to enhance subject-independent approaches is to identify EEG channels that are consistently relevant across different individuals for predicting emotion. With the growing use of deep learning in emotion recognition, incorporating attention mechanisms can help uncover these shared predictive patterns. This study explores this method by applying attention mechanism layers to identify EEG channels that are relevant for predicting emotions in three independent datasets (SEED, SEED-IV, and SEED-V). The model achieved average accuracies of 79.3% (95% CI: 76.0-82.5%), 69.5% (95% CI: 64.2-74.8%), and 60.7% (95% CI: 52.3-69.2%) on these datasets, revealing that EEG channels located along the head circumference are the most crucial for emotion prediction. These results emphasize the importance of capturing relevant electrical activity from these EEG channels, thereby facilitating the prediction of emotions evoked by audiovisual stimuli in subject-independent approaches.
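The record does not include the paper's exact architecture, but the channel-identification idea in the summary can be illustrated with a minimal sketch: score each EEG channel, normalize the scores with a softmax so they form attention weights, and pool the per-channel features with those weights. The weight vector `w`, the bias, the 62-channel montage (the size used by the SEED recordings), and the random features below are all illustrative assumptions, not values from the study.

```python
import numpy as np

def channel_attention(features, w, b=0.0):
    """Attention over EEG channels.

    features: (n_channels, n_features) per-channel feature matrix
    w:        (n_features,) scoring weights (hypothetical; learned in practice)
    b:        scalar bias
    Returns (pooled_features, attention_weights); the largest attention
    weights point to the channels the model treats as most relevant.
    """
    scores = features @ w + b                     # one relevance score per channel
    scores = scores - scores.max()                # numerical stability for softmax
    weights = np.exp(scores) / np.exp(scores).sum()
    pooled = weights @ features                   # attention-weighted channel pooling
    return pooled, weights

# Toy example with random stand-in features
rng = np.random.default_rng(0)
feats = rng.standard_normal((62, 5))              # 62 channels, 5 features each
w = rng.standard_normal(5)
pooled, attn = channel_attention(feats, w)
top_channels = np.argsort(attn)[::-1][:5]         # indices of most-attended channels
```

Averaging `attn` over many subjects and trials is one way the shared, subject-independent channel relevance described in the summary could be read off such a layer.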
Reviewed by: Jiahui Pan, South China Normal University, China; Konstantinos Barmpas, Imperial College London, United Kingdom
Edited by: Panagiotis Tzirakis, Hume AI, United States
ISSN: 1664-0640
DOI: 10.3389/fpsyt.2025.1494369