DAST: A Domain-Adaptive Learning Combining Spatio-Temporal Dynamic Attention for Electroencephalography Emotion Recognition

Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, Vol. 28, No. 5, pp. 2512-2523
Main Authors: Jin, Hao; Gao, Ying; Wang, Tingting; Gao, Ping
Format: Journal Article
Language: English
Published: United States, The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.05.2024

Summary: EEG-based multimodal emotion recognition has become mainstream in affective computing. However, previous studies mainly focus on the perceived emotions (e.g., from posture, speech, or facial expression) of different subjects, while the lack of research on induced emotions (e.g., from video or music) has limited the study of emotion as a two-way process. To address this problem, we propose DAST, a multimodal domain-adaptive method based on EEG and music, which uses spatio-temporal adaptive attention (STA-attention) to globally model the EEG and dynamically maps all embeddings into a high-dimensional space with an adaptive space encoder (ASE). Adversarial training is then performed between a domain discriminator and the ASE to learn invariant emotion representations. Extensive experiments on the DEAP dataset show that our method can further explore the relationship between induced and perceived emotions, and it provides a reliable reference for exploring the potential correlation between EEG and music stimulation.
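The record does not reproduce DAST's implementation, but the adversarial step it describes (a domain discriminator trained against a feature encoder to obtain domain-invariant representations) is conventionally realized with a gradient reversal layer, as in standard domain-adversarial training. The sketch below is a hypothetical, minimal numpy illustration of that idea only: the linear logistic discriminator, the feature dimensions, and the reversal weight `lam` are all assumptions, not details from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def domain_loss_and_grad(features, domain_labels, w):
    """Binary cross-entropy of a linear domain discriminator (weights w)
    and the loss gradient with respect to the input features."""
    p = sigmoid(features @ w)
    eps = 1e-12
    loss = -np.mean(domain_labels * np.log(p + eps)
                    + (1 - domain_labels) * np.log(1 - p + eps))
    # For logistic regression, dL/dz_i = (p_i - y_i)/N and z_i = x_i . w,
    # so dL/dx_i = (p_i - y_i)/N * w.
    grad_features = np.outer(p - domain_labels, w) / len(domain_labels)
    return loss, grad_features

def gradient_reversal(grad, lam=1.0):
    # The reversal layer flips the sign of the domain gradient before it
    # reaches the encoder, so the encoder is updated to *confuse* the
    # discriminator, pushing features toward domain invariance.
    return -lam * grad

rng = np.random.default_rng(0)
feats = rng.normal(size=(8, 4))               # toy embeddings from the encoder
domains = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # hypothetical: 0 = EEG, 1 = music
w = rng.normal(size=4)                        # discriminator weights

loss, g = domain_loss_and_grad(feats, domains, w)
g_rev = gradient_reversal(g)                  # gradient seen by the encoder
```

In a full training loop the discriminator descends `g` while the encoder descends `g_rev`; the paper's ASE and STA-attention modules would take the place of the toy encoder features here.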
ISSN: 2168-2194
EISSN: 2168-2208
DOI: 10.1109/JBHI.2023.3307606