CiABL: Completeness-Induced Adaptative Broad Learning for Cross-Subject Emotion Recognition With EEG and Eye Movement Signals
Published in: IEEE Transactions on Affective Computing, Vol. 15, No. 4, pp. 1970-1984
Main Authors:
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.10.2024
Summary: Although multimodal physiological data from the central and peripheral nervous systems can objectively reflect human emotional states, the individual differences caused by their non-stationary and low signal-to-noise properties pose several challenges for cross-subject emotion recognition tasks. Many previous studies focused on learning highly correlated information between different modalities, which easily leads to incomplete descriptions of the different physiological signals and to difficulty in aligning critical emotional information. To tackle these challenges, this paper proposes a novel multimodal emotion recognition model for improving generalization performance on unseen target-domain subjects, termed Completeness-induced Adaptative Broad Learning (CiABL). The proposed CiABL can gradually explore a complete modality representation that encompasses both modality-relevant and modality-independent information, avoiding the performance loss caused by spurious correlations between modalities. Subsequently, a well-designed weighted representation distribution alignment mechanism in CiABL appropriately aligns the marginal and conditional distributions to greatly reduce the influence of individual differences. Extensive experiments on the SEED and SEED-FRA datasets demonstrate the effectiveness and generalization of the proposed CiABL, which outperforms current state-of-the-art methods. In addition, CiABL can precisely quantify the importance of global features to properly explain the modality contributions and the averaged activation patterns of the brain under cross-subject emotion recognition tasks.
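The abstract's "weighted representation distribution alignment" combines marginal alignment (overall feature distributions) with conditional alignment (class-conditional distributions, typically using target pseudo-labels). The paper's exact mechanism is not given in this record; the following is a minimal illustrative sketch of that general idea using a linear (mean-embedding) MMD, where the balance weight `mu` and all function names are hypothetical, not the authors' implementation.

```python
import numpy as np

def linear_mmd(a, b):
    """Squared distance between the feature means of two sample sets."""
    return float(np.sum((a.mean(axis=0) - b.mean(axis=0)) ** 2))

def weighted_alignment_loss(Xs, ys, Xt, yt_pseudo, mu=0.5):
    """Weighted sum of marginal and conditional distribution discrepancies.

    Xs, ys        : source-domain features and true labels
    Xt, yt_pseudo : target-domain features and pseudo-labels
    mu            : hypothetical trade-off weight between the two terms
    """
    # Marginal term: align the overall source and target feature distributions.
    marginal = linear_mmd(Xs, Xt)
    # Conditional term: align class-conditional distributions, averaged
    # over the classes present in both domains.
    classes = np.intersect1d(np.unique(ys), np.unique(yt_pseudo))
    conditional = (
        np.mean([linear_mmd(Xs[ys == c], Xt[yt_pseudo == c]) for c in classes])
        if len(classes) else 0.0
    )
    return (1 - mu) * marginal + mu * conditional
```

In practice such a loss is minimized jointly with the classification loss, so the learned representation becomes both discriminative and subject-invariant; the weighting lets the model emphasize conditional alignment once target pseudo-labels become reliable.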
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2024.3392791