Cross-Subject Domain Adaptation for Classifying Working Memory Load with Multi-Frame EEG Images
Main Authors | , , |
---|---|
Format | Journal Article |
Language | English |
Published | 12.06.2021 |
DOI | 10.48550/arxiv.2106.06769 |
Summary: Working memory (WM), denoting information temporarily stored in the mind, is a fundamental research topic in the field of human cognition. Electroencephalograph (EEG), which can monitor the electrical activity of the brain, has been widely used to measure the level of WM. However, one critical challenge is that individual differences may degrade performance, especially when an established model is applied to an unfamiliar subject. In this work, we propose a cross-subject deep adaptation model with spatial attention (CS-DASA) to generalize workload classification across subjects. First, we transform EEG time series into multi-frame EEG images incorporating spatial, spectral, and temporal information. Next, the subject-shared module in CS-DASA receives multi-frame EEG image data from both source and target subjects and learns common feature representations. Then, in the subject-specific module, the maximum mean discrepancy (MMD) is used to measure the domain distribution divergence in a reproducing kernel Hilbert space, which adds an effective penalty loss for domain adaptation. Additionally, a subject-to-subject spatial attention mechanism is employed to focus on the discriminative spatial features of the target image data. Experiments conducted on a public WM EEG dataset containing 13 subjects show that the proposed model achieves better performance than existing state-of-the-art methods.
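The MMD penalty mentioned in the summary can be sketched compactly. The snippet below is a minimal illustration, not the authors' code: a multi-bandwidth Gaussian-kernel estimate of MMD² between a batch of source features and a batch of target features, the kind of term that would be added to the classification loss with a trade-off weight. The function name, the choice of bandwidths, and the weighting scheme are assumptions.

```python
# Hypothetical sketch: Gaussian multi-kernel MMD^2 penalty between source and
# target feature batches, as commonly used for deep domain adaptation.
import torch


def gaussian_mmd(source: torch.Tensor, target: torch.Tensor,
                 bandwidths=(1.0, 2.0, 4.0)) -> torch.Tensor:
    """Biased MMD^2 estimate between feature batches of shape (N, D) and (M, D)."""
    features = torch.cat([source, target], dim=0)
    # Pairwise squared Euclidean distances between all feature vectors.
    sq_dists = torch.cdist(features, features, p=2).pow(2)
    # Sum of Gaussian kernels over several bandwidths (a simple multi-kernel choice).
    kernel = sum(torch.exp(-sq_dists / (2.0 * b ** 2)) for b in bandwidths)
    n = source.size(0)
    k_ss = kernel[:n, :n].mean()      # source-source similarity
    k_tt = kernel[n:, n:].mean()      # target-target similarity
    k_st = kernel[:n, n:].mean()      # cross-domain similarity
    return k_ss + k_tt - 2.0 * k_st   # MMD^2 in the induced RKHS


# Usage (assumed): total loss = classification loss + lambda * MMD penalty.
# loss = task_loss + lam * gaussian_mmd(source_features, target_features)
```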
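Likewise, the subject-to-subject spatial attention is only described at a high level in the summary. The sketch below shows one plausible reading, in which source feature maps supply the queries while target feature maps supply the keys and values, so that spatial positions of the target representation are re-weighted. The class name, shapes, and residual formulation are assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a subject-to-subject spatial attention block:
# source feature maps provide queries, target feature maps provide keys/values,
# and the attention output re-weights spatial positions of the target features.
import torch
import torch.nn as nn


class SubjectSpatialAttention(nn.Module):
    def __init__(self, channels: int, reduced: int = 32):
        super().__init__()
        self.query = nn.Conv2d(channels, reduced, kernel_size=1)   # from source features
        self.key = nn.Conv2d(channels, reduced, kernel_size=1)     # from target features
        self.value = nn.Conv2d(channels, channels, kernel_size=1)  # from target features

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        b, c, h, w = tgt.shape
        q = self.query(src).flatten(2).transpose(1, 2)             # (B, HW, reduced)
        k = self.key(tgt).flatten(2)                               # (B, reduced, HW)
        v = self.value(tgt).flatten(2)                             # (B, C, HW)
        attn = torch.softmax(q @ k / (k.size(1) ** 0.5), dim=-1)   # (B, HW, HW)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)          # attended target features
        return out + tgt  # residual connection keeps the original target features
```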