A Relation Feature Comparison Network for Cross-Domain Recognition of Motion Intention

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 73, pp. 1-13
Main Authors: Xu, Jiacan; Li, Donglin; Zhou, Peng; Zhang, Yuxian; Wang, Zinan; Ma, Dazhong
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2024

Summary: The ability to decode across subjects without recording additional training data is crucial for brain-computer interface (BCI) applications. However, electroencephalogram (EEG) data exhibit cross-session and cross-subject variability due to external factors and individual differences, so cross-domain EEG signal recognition remains challenging. To address this problem, this article proposes a feature relation contrastive network (FRCN). First, covariance matrices are used to align the data distributions of the source and target domains, reducing the difference between them. Second, feature representations are extracted by a pretrained embedding network, and the correlations of the features are aligned using a nonlinear transformation. Then, a source domain selection method is proposed that uses resting-state data from different domains to measure the similarity of their motion-related data distributions; the selected source domain data are then used for fine-tuning. Finally, a metric learning-based interdomain relation contrastive module (RCM) is proposed to learn multiple nonlinear distance metrics over different levels of features simultaneously, which enables interdomain contrastive learning and accurately compares the relations between samples to reduce negative transfer. When testing on the target task, effective matching and a similarity comparison function over features at multiple abstraction levels jointly alleviate reliance on the embedding network's ability to produce linearly separable features. Experimental results show that the FRCN achieves better results on the BCI Competition IV-2a and IV-2b datasets, and ablation experiments validate the effectiveness of the method.
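The summary's first step, aligning source- and target-domain distributions via covariance matrices, is not spelled out in this record; a common realization of that idea is Euclidean alignment, which whitens each domain's trials by the inverse square root of its mean spatial covariance so that every domain's average covariance becomes the identity. The sketch below is a minimal illustration of that generic technique, not the paper's exact procedure; the array shapes and function name are assumptions.

```python
import numpy as np

def euclidean_align(trials):
    """Whiten EEG trials by the inverse square root of their mean covariance.

    trials: array of shape (n_trials, n_channels, n_samples).
    Returns the trials transformed so their mean spatial covariance is identity.
    """
    # Per-trial spatial covariance matrices (channels x channels)
    covs = np.array([X @ X.T / X.shape[1] for X in trials])
    R = covs.mean(axis=0)  # reference (arithmetic-mean) covariance
    # Inverse matrix square root of R via eigendecomposition
    vals, vecs = np.linalg.eigh(R)
    R_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return np.array([R_inv_sqrt @ X for X in trials])

# Toy demo with random stand-in "EEG" trials for one domain
rng = np.random.default_rng(0)
src = rng.standard_normal((20, 8, 128))   # 20 trials, 8 channels, 128 samples
aligned = euclidean_align(src)

# After alignment, the mean covariance is (numerically) the identity,
# so domains aligned this way share a common reference distribution.
mean_cov = np.mean([X @ X.T / X.shape[1] for X in aligned], axis=0)
print(np.allclose(mean_cov, np.eye(8), atol=1e-6))  # True
```

Applying the same whitening independently to each subject's data gives all domains an identity mean covariance, which is one simple way to "reduce the difference" between source and target distributions before feature extraction.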
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2024.3420350