Attention mechanism-based CNN for facial expression recognition

Bibliographic Details
Published in: Neurocomputing, Vol. 411, pp. 340–350
Main Authors: Li, Jing; Jin, Kan; Zhou, Dalin; Kubota, Naoyuki; Ju, Zhaojie
Format: Journal Article
Language: English; Japanese
Published: Elsevier B.V., 21.10.2020

Summary: Facial expression recognition is an active research topic with applications in many computer vision fields, such as human–computer interaction and affective computing. In this paper, we propose a novel end-to-end network with an attention mechanism for automatic facial expression recognition. The network architecture consists of four parts: the feature extraction module, the attention module, the reconstruction module and the classification module. The LBP features capture image texture information and thus the small movements of the face, which improves network performance. The attention mechanism makes the neural network focus on useful features. We combine the LBP features with the attention mechanism to enhance the attention model and obtain better results. In addition, we collected and labelled a new facial expression dataset of seven expressions from 35 subjects aged 20 to 25. For each subject, we captured both RGB images and depth images with a Microsoft Kinect sensor. For each image type, there are 245 image sequences, each containing 110 images, for 26,950 images in total. We apply the proposed method to our own dataset and to four representative expression datasets, i.e., JAFFE, CK+, FER2013 and Oulu-CASIA. The experimental results demonstrate the feasibility and effectiveness of the proposed method.
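The abstract's two key ingredients, LBP texture features and an attention map that reweights useful regions, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the exact LBP variant, attention design and module names are assumptions; shown here are the basic 3×3 local binary pattern operator and a toy sigmoid spatial-attention gate.

```python
import numpy as np

def lbp_codes(img):
    """Basic 3x3 LBP: each pixel gets an 8-bit code, one bit per
    neighbour, set when the neighbour is >= the centre pixel.
    These codes encode local texture, the small facial movements
    the abstract mentions."""
    padded = np.pad(img, 1, mode="edge")
    center = padded[1:-1, 1:-1]
    h, w = center.shape
    codes = np.zeros((h, w), dtype=np.uint8)
    # 8 neighbours, clockwise from top-left; each contributes one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neigh = padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        codes |= (neigh >= center).astype(np.uint8) << bit
    return codes

def spatial_attention(features):
    """Toy spatial attention over a (C, H, W) feature map: average the
    channels into a per-pixel score, squash it to (0, 1) with a sigmoid,
    and rescale the features so salient locations are emphasised."""
    score = features.mean(axis=0)              # (H, W) saliency proxy
    weights = 1.0 / (1.0 + np.exp(-score))     # sigmoid gate in (0, 1)
    return features * weights                  # broadcast over channels
```

In the paper's pipeline the LBP maps are combined with learned CNN features before attention; here the two steps are only shown in isolation to make the mechanism concrete.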
ISSN: 0925-2312; 1872-8286
DOI: 10.1016/j.neucom.2020.06.014