fNIRS-Based Emotion Recognition by Spatial Transformer and WGAN Data Augmentation Toward Developing a Novel Affective BCI
Published in: IEEE Transactions on Affective Computing, Vol. 16, No. 2, pp. 875-890
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.04.2025
Summary: The affective brain-computer interface (aBCI) enables the objective identification or regulation of human emotions. Current aBCIs rely mainly on electroencephalography (EEG). However, research shows that emotion involves a large-scale distributed brain network. Compared with EEG, functional near-infrared spectroscopy (fNIRS) offers higher spatial resolution and therefore greater potential for capturing the spatial information of emotion, which may foster the development of new aBCIs. We propose a novel self-attention-based deep-learning spatial transformer model for cross-subject fNIRS emotion recognition, which automatically learns spatial attention weights for emotion with strong interpretability. In addition, we performed data augmentation with a Wasserstein generative adversarial network (WGAN). Results show that: (1) we achieved 84% three-category cross-subject emotion decoding accuracy, with the spatial transformer module and the WGAN improving accuracy by 12.8% and 4.3%, respectively; (2) compared with state-of-the-art fNIRS research, we lead by 10% in three-category decoding accuracy; (3) compared with state-of-the-art EEG research, we lead by 28% in arousal decoding accuracy, 10% in valence decoding accuracy, and 2% in three-category decoding accuracy; and (4) our approach holds the potential to uncover the brain's spatial encoding of human emotion processing, providing a new direction for building interpretable artificial-intelligence models.
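The "spatial attention" the summary describes is, at its core, self-attention computed across fNIRS channels rather than across time steps or tokens. The following is a minimal NumPy sketch of that idea, not the authors' implementation: channel count, feature dimension, and weight matrices are all hypothetical placeholders.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def spatial_self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention across fNIRS channels.

    X: (channels, features) -- each row holds one optode channel's
       feature vector (e.g., mean HbO/HbR over a trial window).
    Returns the attended features and the (channels, channels)
    attention map; the map shows which channels the model weights
    most, which is the source of the interpretability claim.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d), axis=-1)  # spatial attention weights
    return A @ V, A

# Hypothetical sizes: 24 channels, 8 features each, 16-dim projections.
rng = np.random.default_rng(0)
n_channels, n_feat, d_model = 24, 8, 16
X = rng.standard_normal((n_channels, n_feat))
Wq, Wk, Wv = (rng.standard_normal((n_feat, d_model)) for _ in range(3))
out, attn = spatial_self_attention(X, Wq, Wk, Wv)
```

Each row of `attn` is a probability distribution over channels, so it can be rendered directly as a topographic attention map.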
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2024.3477302
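For context on the WGAN augmentation the summary mentions: a Wasserstein GAN trains a critic to maximize the score gap between real and generated samples, while the generator minimizes the critic's score on its output. The sketch below shows only these two losses in NumPy; it is an illustration of the general WGAN objective, not the paper's training code.

```python
import numpy as np

def critic_loss(d_real, d_fake):
    """WGAN critic loss: minimize E[D(fake)] - E[D(real)].

    d_real, d_fake: critic scores on real / generated samples
    (e.g., real vs. synthetic fNIRS trials). More negative means
    the critic separates the two distributions better.
    """
    return d_fake.mean() - d_real.mean()

def generator_loss(d_fake):
    """WGAN generator loss: minimize -E[D(fake)]."""
    return -d_fake.mean()

# Toy scores: a critic that rates real trials ~ +1 and fakes ~ -1.
rng = np.random.default_rng(1)
d_real = rng.normal(1.0, 0.1, 64)
d_fake = rng.normal(-1.0, 0.1, 64)
loss_c = critic_loss(d_real, d_fake)   # negative: critic separates them
loss_g = generator_loss(d_fake)
```

In practice the critic must be kept approximately 1-Lipschitz (via weight clipping or a gradient penalty) for the loss to approximate the Wasserstein distance; that constraint is omitted here for brevity.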