Subject-independent EEG emotion recognition with hybrid spatio-temporal GRU-Conv architecture
Published in: Medical & Biological Engineering & Computing, Vol. 61, No. 1, pp. 61-73
Main Authors: , ,
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.01.2023 (Springer Nature B.V.)
Summary: Recently, various deep learning frameworks have shown excellent performance in decoding electroencephalogram (EEG) signals, especially for human emotion recognition. However, most focus only on temporal features and ignore those in the spatial dimension. The traditional gated recurrent unit (GRU) model handles time-series data well, while a convolutional neural network (CNN) can extract spatial characteristics from its input. This paper therefore introduces a hybrid GRU and CNN deep learning framework, named GRU-Conv, to fully leverage the advantages of both. Unlike most previous GRU architectures, however, the model retains the outputs of all GRU units, so it can extract crucial spatio-temporal features from EEG data: the multi-dimensional features of the units after temporal processing in the GRU are passed to a CNN, which extracts spatial information from these temporal features. In this way, EEG signals with different characteristics can be classified more accurately. Finally, subject-independent experiments show that the model performs well on the SEED and DEAP databases, with an average accuracy of 87.04% on SEED and mean accuracies of 70.07% for arousal and 67.36% for valence on DEAP.
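The two-stage idea in the summary can be sketched in miniature: run a GRU over the time axis, stack every hidden state into a 2-D map (rather than keeping only the last state), then convolve over that map. This is a minimal NumPy sketch, not the paper's implementation: biases are omitted, the sizes are toy values, the convolution is a single naive "valid" filter, and all names (`gru_all_states`, `conv2d_valid`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_all_states(x, Wz, Uz, Wr, Ur, Wh, Uh):
    # Run a GRU over x of shape (T, D), keeping EVERY hidden state --
    # the summary's key departure from GRUs that keep only the last one.
    # Biases are omitted for brevity (an assumption of this sketch).
    T = x.shape[0]
    H = Uz.shape[0]
    h = np.zeros(H)
    states = np.empty((T, H))
    for t in range(T):
        z = sigmoid(x[t] @ Wz + h @ Uz)              # update gate
        r = sigmoid(x[t] @ Wr + h @ Ur)              # reset gate
        h_tilde = np.tanh(x[t] @ Wh + (r * h) @ Uh)  # candidate state
        h = (1 - z) * h + z * h_tilde
        states[t] = h
    return states  # (T, H): a 2-D temporal feature map

def conv2d_valid(a, k):
    # Naive single-filter 'valid' 2-D convolution, standing in for
    # the CNN stage that extracts spatial structure from the map.
    ah, aw = a.shape
    kh, kw = k.shape
    out = np.empty((ah - kh + 1, aw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(a[i:i + kh, j:j + kw] * k)
    return out

# Toy sizes (assumptions, not the paper's hyperparameters).
T, D, H = 16, 8, 12
x = rng.standard_normal((T, D))
Wz, Wr, Wh = [0.1 * rng.standard_normal((D, H)) for _ in range(3)]
Uz, Ur, Uh = [0.1 * rng.standard_normal((H, H)) for _ in range(3)]

states = gru_all_states(x, Wz, Uz, Wr, Ur, Wh, Uh)            # temporal stage
features = conv2d_valid(states, rng.standard_normal((3, 3)))  # spatial stage
print(states.shape, features.shape)  # (16, 12) (14, 10)
```

A real model would use learned filters, multiple channels, and a classifier head on `features`; the point here is only the data flow: all T hidden states form the map that the convolution consumes.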
ISSN: 0140-0118 (print), 1741-0444 (electronic)
DOI: 10.1007/s11517-022-02686-x