CNN and LSTM based ensemble learning for human emotion recognition using EEG recordings

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 82, No. 4, pp. 4883-4896
Main Authors: Iyer, Abhishek; Das, Srimit Sritik; Teotia, Reva; Maheshwari, Shishir; Sharma, Rishi Raj
Format: Journal Article
Language: English
Published: New York: Springer US, 01.02.2023 (Springer Nature B.V.)
Summary: Emotion is a significant parameter in daily life and an important factor in human interaction. Human-machine interaction, and advanced applications such as humanoid robots, essentially requires emotion analysis. This paper proposes a novel method for human emotion recognition using electroencephalogram (EEG) signals. Three emotions are considered, namely neutral, positive, and negative. The EEG signals are separated into five frequency bands according to the EEG rhythms, and differential entropy is computed over each frequency-band component. Convolutional neural network (CNN), long short-term memory (LSTM), and hybrid CNN-LSTM models are developed for accurate emotion detection, and the extracted features are fed to all three models for emotion recognition. Finally, an ensemble model combines the predictions of the three models. The proposed approach is validated on two datasets for EEG-based emotion analysis, SEED and DEAP, and achieves 97.16% accuracy on the SEED dataset for emotion classification. The experimental results indicate that the proposed approach is effective and outperforms the compared methods for EEG-based emotion analysis.
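The feature-extraction step described in the summary can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the conventional EEG rhythm band edges (the abstract does not list exact cut-offs), isolates each band with a simple FFT mask rather than the authors' filtering method, and uses the standard Gaussian differential-entropy formula DE = 0.5 ln(2πeσ²) common in SEED-style pipelines.

```python
import numpy as np

# Conventional EEG rhythm bands in Hz (assumed; the paper's exact
# cut-offs are not given in the abstract).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def differential_entropy(x):
    """DE of an approximately Gaussian signal: 0.5 * ln(2*pi*e*var)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def band_de_features(signal, fs):
    """Isolate each rhythm with an FFT mask, then compute DE per band."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    feats = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        band = np.fft.irfft(spectrum * mask, n=len(signal))
        feats[name] = differential_entropy(band)
    return feats

# Toy usage: one second of synthetic 10 Hz (alpha-range) activity
# plus a little noise, sampled at 200 Hz.
np.random.seed(0)
fs = 200
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(fs)
print(band_de_features(sig, fs))
```

Each EEG channel then contributes five DE values per segment; stacking these over channels yields the feature vectors that the CNN, LSTM, and hybrid models consume.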
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-022-12310-7