Identifying emotions from facial expressions using a deep convolutional neural network-based approach
Published in: Multimedia Tools and Applications, Vol. 83, No. 6, pp. 15711–15732
Main Authors:
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.02.2024
Summary: Sentiment identification from facial expressions is an interesting research domain with applications in various disciplines, including security, health, and human-machine interfaces. The main goal of sentiment analysis is to determine an individual's perspective on a topic or a document's overall contextual polarity. In nonverbal communication, sentiment analysis plays a vital role because an individual's feelings are reflected on the face. Researchers in this area aim to improve models and methods and to extract various characteristics that allow computers to predict sentiments more accurately. Sentiment polarities are mainly classified as positive, negative, and neutral. Many sentiment analysis approaches exist, but deep learning architectures can handle extensive data and deliver better performance. We present a solution based on a Convolutional Neural Network (CNN) model for this problem. This work uses the Extended Cohn-Kanade (CK+) and FER-2013 datasets for the facial expression recognition study. Several existing architectures are used to evaluate the efficiency of the proposed model. Extensive experiments are carried out on both the CK+ and FER-2013 datasets, and our framework outperforms state-of-the-art techniques. According to the obtained results, the CNN3 model achieves 79% and 95% accuracy on the FER-2013 and CK+ datasets, respectively.
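The record does not specify the layers of the paper's CNN3 model. As a rough orientation only, the sketch below shows the kind of small Keras CNN commonly trained on FER-2013 (48×48 grayscale face crops, 7 emotion classes); the filter counts, dense size, and dropout rate are illustrative assumptions, not the authors' architecture.

```python
# Minimal illustrative CNN for FER-2013-style input (48x48 grayscale, 7 classes).
# Layer sizes below are assumptions for illustration only; they do not
# reproduce the paper's CNN3 model, which this record does not describe.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_fer_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        # Three conv blocks, doubling the filter count at each stage
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(2),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(2),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(2),
        # Classifier head; dropout curbs overfitting on small datasets like CK+
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_fer_cnn()
model.summary()
```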
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-023-16174-3