CoAdapt: Collaborative Adaptation Between Latent EEG Feature Representation and Annotation for Emotion Decoding

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 74, pp. 1-16
Main Authors: Gong, Xiaoxiao; Chen, Yuxin; Zhang, Pengfei; Peng, Yong; Fang, Jinglong; Cichocki, Andrzej
Format: Journal Article
Language: English
Published: IEEE, 2025
ISSN: 0018-9456; 1557-9662
DOI: 10.1109/TIM.2025.3590828

More Information
Summary: Electroencephalogram (EEG) data contain rich neurophysiological information that can objectively express the emotional state of human beings. However, inherent EEG characteristics such as nonstationarity and weak signal strength, combined with subjects' possibly limited immersion and carry-over effects during data collection, may cause the semantic meaning of an extracted EEG feature vector to mismatch its annotated emotional state, dubbed the 'feature-label inconsistency' dilemma in EEG-based emotion decoding. To this end, this article proposes to alleviate the side effects of feature-label inconsistency from both the feature and the label aspects. On the one hand, we explore a more meaningful emotion-related EEG representation via the latent low-rank representation (LRR). On the other hand, we enhance the correspondence between the explored EEG representation and its annotated emotional state through a label dragging strategy. As a result, a collaborative adaptation (CoAdapt) model between the latent EEG feature representation and its annotation is formed for efficient emotion decoding; it is implemented within a semi-supervised framework to better capture the properties of both the labeled and unlabeled EEG data. Experimental results on three publicly available datasets, SEED-IV, SEED-V, and MPED, show that: 1) CoAdapt achieves better emotion recognition performance than related models; 2) the improvements in interclass separability and label margin are empirically evaluated, indicating the effectiveness of the purified EEG feature representation and the rectified emotion annotation; and 3) some task-related findings are identified from a data-driven perspective, including the emotion carry-over effect and the discriminative spatial patterns in emotion decoding.
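For orientation, the two building blocks named in the summary have widely used standard forms; the sketch below shows a generic latent LRR objective and an epsilon-dragging-style label relaxation, written under the assumption that CoAdapt builds on these common formulations. The actual CoAdapt objective, its coupling terms, and the semi-supervised constraints are defined in the full paper and may differ.

% Latent low-rank representation (latent LRR): decompose the observed EEG
% feature matrix X into a principal part XZ, a latent (salient) part LX,
% and sparse noise E; LX then serves as the purified feature representation.
\min_{Z,\,L,\,E}\ \|Z\|_{*} + \|L\|_{*} + \lambda\|E\|_{1}
\quad \text{s.t.} \quad X = XZ + LX + E

% Label dragging (epsilon-dragging style): relax the binary label matrix Y
% into a rectified target T that enlarges the margin between the annotated
% class and the remaining classes via a learnable nonnegative matrix M.
T = Y + B \odot M, \qquad
B_{ij} = \begin{cases} +1, & Y_{ij} = 1 \\ -1, & \text{otherwise} \end{cases},
\qquad M \ge 0

Here X denotes the EEG feature matrix, Z the sample-wise coefficient matrix, E the sparse residual, Y the annotated label matrix, and T the adapted (dragged) regression target; how these two parts are jointly optimized in CoAdapt is described in the article itself.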