Electroencephalography Emotion Recognition Based on Rhythm Information Entropy Extraction

Bibliographic Details
Published in: Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 28, No. 5, pp. 1095-1106
Main Authors: Liu, Zhen-Tao; Xu, Xin; She, Jinhua; Yang, Zhaohui; Chen, Dan
Format: Journal Article
Language: English
Published: Tokyo: Fuji Technology Press Co. Ltd, 01.09.2024

Summary: Electroencephalography (EEG) is a physiological signal generated directly by the central nervous system. Brain rhythms are closely related to a person's emotional state and are widely used for EEG emotion recognition. In previous studies, the rhythm specificity between different brain channels was seldom explored. In this paper, the rhythm specificity of brain channels is studied to improve the accuracy of EEG emotion recognition. Variational mode decomposition is used to decompose rhythm signals and enhance features, and two kinds of information entropy, i.e., differential entropy (DE) and dispersion entropy (DispEn), are extracted. The rhythm that yields the best single-channel emotion recognition result is selected as the representative rhythm, and the remove-one method is employed to obtain the rhythm information entropy feature. In the experiment, the DEAP database was used for EEG emotion recognition in valence-arousal space. The results showed that the best accuracy of rhythm DE feature classification in the valence dimension is 77.04%, and the best accuracy of rhythm DispEn feature classification in the arousal dimension is 79.25%.
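The two entropy measures named in the abstract have standard definitions in the EEG emotion recognition literature. The sketch below illustrates those common formulations only; it is not the authors' implementation, and the embedding parameters (m, c, d) and the use of scipy/NumPy are assumptions. DE of a band-limited segment is usually computed under a Gaussian assumption as 0.5·ln(2πeσ²); DispEn maps samples to a small number of classes via the normal CDF and takes the normalized Shannon entropy of the resulting dispersion-pattern distribution.

```python
import numpy as np
from scipy.stats import norm

def differential_entropy(band_signal):
    """DE of a band-limited EEG segment under the usual Gaussian assumption:
    DE = 0.5 * ln(2 * pi * e * sigma^2), with sigma^2 the segment variance."""
    var = np.var(band_signal)
    return 0.5 * np.log(2 * np.pi * np.e * var)

def dispersion_entropy(x, m=3, c=6, d=1):
    """Dispersion entropy (DispEn): map samples to c classes with the normal
    CDF, build embedding vectors of length m with delay d, and return the
    Shannon entropy of the pattern distribution normalized by ln(c**m)."""
    x = np.asarray(x, dtype=float)
    # Map samples to (0, 1), then to integer classes 1..c.
    y = norm.cdf(x, loc=x.mean(), scale=x.std())
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    # Collect dispersion patterns and their relative frequencies.
    n = len(z) - (m - 1) * d
    patterns = [tuple(z[i:i + (m - 1) * d + 1:d]) for i in range(n)]
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(c ** m)
```

In a pipeline like the one the abstract describes, each channel would first be separated into rhythm bands (e.g., theta, alpha, beta, gamma) and processed with variational mode decomposition before these entropies are computed per band and per channel; that preprocessing step is not shown here.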
ISSN: 1343-0130, 1883-8014
DOI: 10.20965/jaciii.2024.p1095