Learning CNN features from DE features for EEG-based emotion recognition
Published in: Pattern Analysis and Applications (PAA), Vol. 23, No. 3, pp. 1323–1335
Main Authors: , , ,
Format: Journal Article
Language: English
Published: London: Springer London, 01.08.2020 (Springer Nature B.V.)
Summary: Recently, deep neural networks (DNNs) have shown remarkable success in feature representation for computer vision, audio analysis, and natural language processing. DNNs have also been used for electroencephalography (EEG) signal classification in recent studies on brain–computer interfaces. However, most works train DNNs on one-dimensional EEG features, which ignores the local information shared across channels and frequency bands in the EEG signals. In this paper, we propose a novel emotion recognition method using a convolutional neural network (CNN) that preserves this local information. The proposed method consists of two parts. The first part generates topology-preserving differential entropy (DE) features that retain the distances from the center electrode to the other electrodes. The second part trains the proposed CNN to estimate three emotional states (positive, neutral, negative). We evaluate our method on the SEED dataset, which contains 62-channel EEG signals recorded from 15 subjects. Our experiments show that the proposed method achieves superior performance on SEED, with an average accuracy of 90.41%, and a t-SNE visualization of the features extracted by the proposed CNN shows that our representation outperforms representations based on standard EEG features. In addition, an experiment on the VIG dataset for vigilance estimation from EEG demonstrates that the proposed method can be applied off the shelf to other tasks.
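The abstract describes a two-stage pipeline: band-wise differential entropy (DE) features arranged on a topology-preserving 2D electrode map, followed by a CNN that predicts the three emotion classes. The sketch below is not the authors' code; it is a minimal Python illustration, under assumed settings (five standard EEG bands, a 200 Hz sampling rate, a hypothetical 9×9 electrode grid, and an illustrative CNN depth), of how DE features and such a classifier are commonly implemented.

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

# Frequency bands commonly used for SEED-style DE features (assumed here).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def differential_entropy(x):
    """DE of an (approximately Gaussian) segment: 0.5 * log(2*pi*e*var)."""
    return 0.5 * np.log(2.0 * np.pi * np.e * np.var(x))

def de_features(eeg, fs=200):
    """eeg: (channels, samples) array -> (bands, channels) DE matrix."""
    feats = np.zeros((len(BANDS), eeg.shape[0]))
    for i, (lo, hi) in enumerate(BANDS.values()):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=1)          # band-pass each channel
        feats[i] = [differential_entropy(ch) for ch in filtered]
    return feats

# The paper maps the 62 channels onto a 2D grid that preserves electrode
# topology; the 9x9 grid size below is an assumption for illustration only.
class EmotionCNN(nn.Module):
    def __init__(self, in_bands=5, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):            # x: (batch, bands, 9, 9) feature maps
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    segment = np.random.randn(62, 200 * 4)       # 4 s of 62-channel EEG
    de = de_features(segment)                    # (5, 62) DE features
    grid = torch.randn(1, 5, 9, 9)               # after 2D topology mapping
    logits = EmotionCNN()(grid)                  # (1, 3) class scores
```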
ISSN: 1433-7541, 1433-755X
DOI: 10.1007/s10044-019-00860-w