EEG-Based Emotion Recognition With Haptic Vibration by a Feature Fusion Method

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, Vol. 71, pp. 1-11
Main Authors: Li, Dahua; Yang, Zhiyi; Hou, Fazheng; Kang, Qiaoju; Liu, Shuang; Song, Yu; Gao, Qiang; Dong, Enzeng
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
Summary: Emotion recognition based on electroencephalogram (EEG) signals has been one of the most active research topics in affective computing. Previous studies of emotion recognition usually relied on a single stimulus source, such as visual or auditory content. In this work, we propose a novel emotional stimulation scheme that synchronizes haptic vibration with audiovisual content to form a mixed visual-auditory-haptic sensation that triggers emotions. Fifteen subjects were recruited to watch four kinds of emotional movie clips (happiness, fear, sadness, and neutral), with or without haptic vibration, while their EEG signals were collected simultaneously. The power spectral density (PSD), differential entropy (DE), wavelet entropy (WE), and brain function network (BFN) features were extracted and fused to capture the time-frequency-spatial characteristics of emotional EEG signals. The t-distributed stochastic neighbor embedding (t-SNE) method was utilized for dimensionality reduction and feature selection, and the fused features were classified by a stacking ensemble learning framework. The experimental results show that the proposed haptic vibration strategy can enhance the activity of emotion-related brain regions, and the average classification accuracy was 85.46%.
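To illustrate the pipeline the abstract describes, below is a minimal sketch of two of the listed features (band PSD via Welch's method and the Gaussian-assumption differential entropy, DE = 0.5·log(2πeσ²)) followed by a stacking ensemble classifier. This is not the authors' implementation: the band edges, base learners, sampling rate, and the synthetic data are illustrative assumptions, and the WE, BFN, and t-SNE steps are omitted for brevity.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

def band_features(eeg, fs=200, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """PSD and DE features per channel and frequency band.

    eeg: array of shape (n_channels, n_samples).
    DE assumes the band-limited EEG is approximately Gaussian,
    so DE = 0.5 * log(2 * pi * e * variance), with the mean band
    power used as the variance estimate.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)   # psd: (n_channels, n_freqs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        band_power = psd[:, mask].mean(axis=1)   # mean PSD in the band
        feats.append(band_power)                               # PSD feature
        feats.append(0.5 * np.log(2 * np.pi * np.e * band_power))  # DE feature
    return np.concatenate(feats)

# Hypothetical data: 60 trials, 32 channels, 4 s at 200 Hz, 3 emotion classes.
rng = np.random.default_rng(0)
X = np.stack([band_features(rng.standard_normal((32, 800))) for _ in range(60)])
y = rng.integers(0, 3, size=60)

# Stacking ensemble: heterogeneous base learners, logistic-regression meta-learner.
stack = StackingClassifier(
    estimators=[
        ("svm", SVC(probability=True)),
        ("knn", KNeighborsClassifier()),
        ("rf", RandomForestClassifier(random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X, y)
print(X.shape)  # (60, 256): 32 channels x 4 bands x 2 features
```

In practice the WE and BFN features would be concatenated onto the same vector before the dimensionality-reduction step, and accuracy would be estimated with cross-validation rather than training-set scores.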
ISSN: 0018-9456, 1557-9662
DOI: 10.1109/TIM.2022.3147882