EEG based emotion recognition using fusion feature extraction method
| Published in | Multimedia Tools and Applications, Vol. 79, No. 37-38, pp. 27057-27074 |
|---|---|
| Main Authors | , , , , , |
| Format | Journal Article |
| Language | English |
| Published | New York: Springer US (Springer Nature B.V), 01.10.2020 |
| Subjects | |
Summary: As a high-level function of the human brain, emotion is the external manifestation of people’s psychological characteristics, and it has a great impact on personality and mental health. Emotion classification from electroencephalogram (EEG) signals has therefore attracted much attention. To improve the precision of EEG-based emotion recognition, we propose a fused feature extraction method for classifying three emotions (neutral, happiness, and sadness). Standardized movie clips were selected to induce the corresponding emotions, and the EEG responses of 10 participants were collected with an Emotiv EPOC headset. This paper systematically compares two kinds of EEG features (power spectrum and wavelet energy entropy) and their fusion for emotion classification. To reduce the dimension of the fused features, principal component analysis (PCA) was used for dimensionality reduction and feature selection. A support vector machine (SVM) classifier and a relevance vector machine (RVM) classifier were each used for emotion recognition. The experimental results show that the fused features outperformed either single feature type with both classifiers, with averaged classification accuracies of 89.17% (SVM) and 91.18% (RVM).
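The feature-fusion pipeline described in the summary can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the channel count (14), sampling rate (128 Hz, typical of the Emotiv EPOC), frequency bands, wavelet choice ('db4'), and decomposition level are assumptions; Welch's method stands in for the paper's power-spectrum computation, and only the SVM branch is shown because scikit-learn does not ship an RVM classifier.

```python
# Hypothetical sketch of the fused-feature pipeline from the summary:
# power-spectrum features + wavelet energy entropy -> PCA -> SVM.
# Channel count, sampling rate, bands, and wavelet are assumptions,
# not values reported by the paper.
import numpy as np
import pywt
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 128  # assumed Emotiv EPOC sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(epoch):
    """Average band power per channel via Welch's method. epoch: (channels, samples)."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, idx].mean(axis=-1))
    return np.concatenate(feats)  # shape: (channels * n_bands,)

def wavelet_energy_entropy(epoch, wavelet="db4", level=4):
    """Entropy of the relative energies of the wavelet sub-bands, one value per channel."""
    feats = []
    for ch in epoch:
        coeffs = pywt.wavedec(ch, wavelet, level=level)
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()
        feats.append(-np.sum(p * np.log2(p + 1e-12)))
    return np.asarray(feats)  # shape: (channels,)

def fused_features(epoch):
    """Concatenate both feature kinds; PCA later reduces the dimension."""
    return np.concatenate([band_power_features(epoch), wavelet_energy_entropy(epoch)])

# Synthetic data standing in for labeled EEG epochs
# (3 classes: neutral, happiness, sadness).
rng = np.random.default_rng(0)
epochs = rng.standard_normal((90, 14, FS * 5))  # 90 epochs, 14 channels, 5 s each
labels = rng.integers(0, 3, size=90)
X = np.array([fused_features(e) for e in epochs])

clf = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
clf.fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```

In this sketch PCA keeps enough components to retain 95% of the variance of the fused feature vector; the paper does not report the exact reduced dimension, so that threshold is illustrative only.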
Bibliography: ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-020-09354-y