Bipartite Graph Adversarial Network for Subject-Independent Emotion Recognition

Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, Vol. PP, pp. 1-14
Main Authors: Niaki, Marzieh; Dharia, Shyamal Y.; Chen, Yangjun; Valderrama, Camilo E.
Format: Journal Article
Language: English
Published: United States, IEEE, 14.05.2025

Summary: Emotions play a vital role in connecting and sharing with others. However, individuals with emotional disorders face challenges in expressing their emotions, which affects their social lives. Current artificial intelligence tools help address this problem by enabling the development of methods that recognize emotions from electroencephalographic (EEG) signals. However, the high variability across individuals makes it challenging to develop emotion recognition methods that generalize well across different subjects. Previous studies have addressed this issue using domain adversarial neural networks (DANN), in which differences in EEG signals among individuals are minimized. Although DANN has shown potential to reduce domain variance, previous studies have scarcely explored the inclusion of layer-specific components to advance further toward that goal. This study addressed this limitation by incorporating bipartite (BP) graphs into a DANN architecture to further reduce variability. We evaluated our model on five benchmark datasets for emotion recognition (SEED, SEED-IV, SEED-V, SEED-FRA, and SEED-GER) comprising a total of 62 individuals. Our model yielded accuracies of 82.1%, 77.3%, 85.8%, 90.7%, and 87.6% on the SEED-V, SEED-IV, SEED, SEED-FRA, and SEED-GER datasets, respectively. Notably, these accuracies are either higher than or comparable to those of current state-of-the-art models. Furthermore, our model identified that the frontal, temporal, and parietal EEG channels are crucial for detecting emotions evoked by audiovisual stimuli.
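For readers unfamiliar with the DANN idea the summary describes, the following is a minimal sketch of a domain adversarial network with a gradient reversal layer, where the "domain" to be confused is the subject identity. The layer sizes, the 310-dimensional input (a common size for SEED differential-entropy features: 62 channels x 5 frequency bands), the subject count, and the training snippet are illustrative assumptions, not the authors' architecture; in particular, the bipartite-graph components the article adds are not reproduced here.

import torch
import torch.nn as nn

class GradientReversal(torch.autograd.Function):
    # Identity on the forward pass; flips and scales gradients on the backward
    # pass, so the feature extractor learns to confuse the domain head.
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None

class DANN(nn.Module):
    # Hypothetical layer sizes for illustration only.
    def __init__(self, n_features=310, n_emotions=5, n_subjects=15, lambd=1.0):
        super().__init__()
        self.lambd = lambd
        # Shared feature extractor over per-trial EEG feature vectors
        self.features = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.emotion_head = nn.Linear(64, n_emotions)  # task classifier
        self.domain_head = nn.Linear(64, n_subjects)   # adversarial subject classifier

    def forward(self, x):
        z = self.features(x)
        emotion_logits = self.emotion_head(z)
        # Reversed gradients push z toward subject-invariant representations
        domain_logits = self.domain_head(GradientReversal.apply(z, self.lambd))
        return emotion_logits, domain_logits

# Illustrative training step on random stand-in data
model = DANN()
x = torch.randn(32, 310)                 # batch of EEG feature vectors
y_emotion = torch.randint(0, 5, (32,))   # emotion labels
y_subject = torch.randint(0, 15, (32,))  # subject (domain) labels
emo_logits, dom_logits = model(x)
loss = (nn.functional.cross_entropy(emo_logits, y_emotion)
        + nn.functional.cross_entropy(dom_logits, y_subject))
loss.backward()  # domain gradients arrive reversed at the feature extractor

Minimizing the emotion loss while the reversed domain loss penalizes subject-identifiable features is what drives the subject-independent generalization the summary refers to.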
ISSN: 2168-2194
EISSN: 2168-2208
DOI: 10.1109/JBHI.2025.3570187