An evolutionary optimization method for selecting features for speech emotion recognition

Bibliographic Details
Published in Telkomnika Vol. 21; no. 1; pp. 159-167
Main Authors Bagadi, Kesava Rao; Sivappagari, Chandra Mohan Reddy
Format Journal Article
Language English
Published Yogyakarta: Ahmad Dahlan University, 01.02.2023

More Information
Summary: Human-computer interactions benefit greatly from emotion recognition from speech. To promote a contact-free environment during the coronavirus disease 2019 (COVID-19) pandemic, most digitally based systems relied on speech-based devices. Consequently, emotion detection from speech has many beneficial applications, including in pathology. The vast majority of speech emotion recognition (SER) systems are built on machine learning or deep learning models and therefore demand greater computing power and resources. This issue has been addressed by developing traditional algorithms for feature selection. Recent research has shown that nature-inspired or evolutionary meta-heuristic approaches, such as equilibrium optimization (EO) and cuckoo search (CS), are superior to the traditional feature selection (FS) models in terms of recognition performance. The purpose of this study is to investigate the impact of meta-heuristic feature selection approaches on emotion recognition from speech. To achieve this, we selected the Ryerson audio-visual database of emotional speech and song (RAVDESS) and obtained maximum recognition accuracies of 89.64% using the EO algorithm and 92.71% using the CS algorithm. As a final step, we plotted the associated precision and F1 score for each of the emotional classes.
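
The abstract describes wrapper-style feature selection driven by a meta-heuristic such as cuckoo search. The following is a minimal sketch of how a binary cuckoo search wrapper for feature selection might look; the k-NN fitness function, the synthetic stand-in data, the population size, and the abandonment rate are illustrative assumptions and not the paper's actual configuration.

    # Sketch: binary cuckoo search (CS) as a wrapper feature selector.
    # Assumptions: synthetic data stands in for RAVDESS speech features,
    # and a k-NN cross-validation accuracy is used as the fitness function.
    import numpy as np
    from math import gamma, sin, pi
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=300, n_features=40,
                               n_informative=10, random_state=0)

    def fitness(mask):
        # Cross-validated accuracy on the selected feature subset.
        if mask.sum() == 0:
            return 0.0
        clf = KNeighborsClassifier(n_neighbors=5)
        return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

    def levy_step(size, beta=1.5):
        # Levy-flight step via Mantegna's algorithm.
        sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
                 (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
        u = rng.normal(0, sigma, size)
        v = rng.normal(0, 1, size)
        return u / np.abs(v) ** (1 / beta)

    n_nests, n_feats, n_iter, pa = 15, X.shape[1], 30, 0.25
    nests = rng.integers(0, 2, (n_nests, n_feats))   # binary feature masks
    scores = np.array([fitness(n) for n in nests])

    for _ in range(n_iter):
        best = nests[scores.argmax()]
        for i in range(n_nests):
            # Levy flight toward the best nest, squashed to [0, 1] and binarised.
            step = levy_step(n_feats) * (nests[i] - best)
            prob = 1 / (1 + np.exp(-step))
            candidate = (rng.random(n_feats) < prob).astype(int)
            s = fitness(candidate)
            if s > scores[i]:
                nests[i], scores[i] = candidate, s
        # Abandon a fraction pa of the worst nests and rebuild them randomly.
        worst = scores.argsort()[: int(pa * n_nests)]
        nests[worst] = rng.integers(0, 2, (len(worst), n_feats))
        scores[worst] = [fitness(n) for n in nests[worst]]

    best = nests[scores.argmax()]
    print(f"selected {best.sum()}/{n_feats} features, CV accuracy {scores.max():.3f}")

In a wrapper setup of this kind, the same loop could report per-class precision and F1 on a held-out split once the best feature mask is fixed, which mirrors the evaluation the abstract describes.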
ISSN:1693-6930
2302-9293
DOI:10.12928/telkomnika.v21i1.24261