Evolutionary Ensemble Learning for EEG-Based Cross-Subject Emotion Recognition

Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, Vol. 28, No. 7, pp. 3872-3881
Main Authors: Zhang, Hanzhong; Zuo, Tienyu; Chen, Zhiyang; Wang, Xin; Sun, Poly Z.H.
Format: Journal Article
Language: English
Published: United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2024

Summary: Electroencephalogram (EEG) has been widely utilized in emotion recognition due to its high temporal resolution and reliability. However, the individual differences and non-stationary characteristics of EEG, along with the complexity and variability of emotions, pose challenges in generalizing emotion recognition models across subjects. In this paper, an end-to-end framework is proposed to improve the performance of cross-subject emotion recognition. A novel evolutionary programming (EP)-based optimization strategy with neural network (NN) as the base classifier, termed NN ensemble with EP (EPNNE), is designed for cross-subject emotion recognition. The effectiveness of the proposed method is evaluated on the publicly available DEAP, FACED, SEED, and SEED-IV datasets. Numerical results demonstrate that the proposed method is superior to state-of-the-art cross-subject emotion recognition methods. The proposed end-to-end framework for cross-subject emotion recognition aids biomedical researchers in effectively assessing individual emotional states, thereby enabling efficient treatment and interventions.
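
The abstract describes EPNNE only at a high level, so the following is a minimal, hypothetical sketch of the general idea it names: classical evolutionary programming (mutation-only search with truncation selection) used to tune how a pool of neural-network base classifiers is combined. It is not the authors' implementation; the synthetic stand-in data, network sizes, EP hyperparameters, and weighted soft-voting scheme are all assumptions for illustration.

```python
# Illustrative sketch (not the paper's code): evolutionary programming (EP)
# searches over combination weights for a pool of NN base classifiers,
# in the spirit of the EPNNE idea described in the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

rng = np.random.default_rng(0)

# Stand-in for EEG features: a synthetic binary "emotion" classification task.
X, y = make_classification(n_samples=600, n_features=32, n_informative=12, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Train a pool of diverse NN base classifiers (different seeds / hidden sizes).
base_nets = [
    MLPClassifier(hidden_layer_sizes=(h,), max_iter=500, random_state=s).fit(X_tr, y_tr)
    for s, h in enumerate([16, 32, 64, 32, 16])
]
# Validation-set class probabilities from each base net: shape (n_nets, n_val, n_classes).
probs = np.stack([net.predict_proba(X_val) for net in base_nets])

def fitness(w):
    """Validation accuracy of the weighted soft vote defined by weights w."""
    w = np.abs(w) / (np.abs(w).sum() + 1e-12)         # normalize to a convex combination
    fused = np.tensordot(w, probs, axes=1)            # fused probabilities, (n_val, n_classes)
    return (fused.argmax(axis=1) == y_val).mean()

# Classical EP loop: Gaussian mutation only, (mu + mu) truncation selection.
mu, generations, sigma = 20, 50, 0.2
pop = rng.normal(1.0, 0.3, size=(mu, len(base_nets)))
for _ in range(generations):
    offspring = pop + rng.normal(0.0, sigma, size=pop.shape)   # mutate every parent
    union = np.vstack([pop, offspring])
    scores = np.array([fitness(w) for w in union])
    pop = union[np.argsort(scores)[::-1][:mu]]                 # keep the mu fittest

best = pop[0]
print("ensemble validation accuracy:", fitness(best))
```

In the actual EPNNE framework the base classifiers and EP operators would be tailored to cross-subject EEG features and evaluated on DEAP, FACED, SEED, and SEED-IV; this sketch only shows the mutation-plus-truncation-selection loop that characterizes classical EP applied to an NN ensemble.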
ISSN: 2168-2194
EISSN: 2168-2208
DOI: 10.1109/JBHI.2024.3384816