Micro-Expression Recognition Based on Spatio-Temporal Capsule Network

Bibliographic Details
Published in: IEEE Access, Vol. 11, p. 1
Main Authors: Shang, Ziyang; Liu, Jie; Li, Xinfu
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023

Summary: Micro-expression (ME) is a subtle, spontaneous facial response and one of the essential indicators of psychological stress. Because it reflects psychological state accurately and cannot be deliberately controlled, ME has crucial applications in many psychology-related fields. However, owing to small data volumes and high data redundancy, existing micro-expression recognition (MER) methods cannot balance accuracy and recognition speed. We therefore propose a deep learning algorithm based on a spatio-temporal capsule network (STCP-Net) that reduces recognition time while maintaining accuracy. STCP-Net consists of four parts: a jitter-removal module, a differential feature extraction module, a spatio-temporal capsule (STCP) module, and a fully connected layer. The first two modules extract diverse differential features more accurately and reduce the impact of head jitter. The STCP module then extracts spatio-temporal features progressively, layer by layer, fully modelling the temporal and spatial relationships between features. Finally, this study conducts extensive experiments on standard datasets under the Leave-One-Subject-Out (LOSO) cross-validation protocol, and the results and analysis demonstrate that the algorithm is effective and competitive.
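
As a rough illustration of the pipeline the summary describes (differencing against a reference frame, a capsule stage using the squashing non-linearity, and a fully connected classifier), the following minimal PyTorch sketch may help. All module names, layer sizes, and the simple onset-frame differencing are assumptions made here for illustration, not the authors' implementation; the jitter-removal module and the capsule routing details are omitted.

# Illustrative sketch only: names and sizes are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    """Capsule squashing non-linearity: keeps vector orientation, maps norm into (0, 1)."""
    norm2 = (s * s).sum(dim=dim, keepdim=True)
    return (norm2 / (1.0 + norm2)) * s / torch.sqrt(norm2 + eps)

class DifferentialFeatures(nn.Module):
    """Frame-difference features: subtract the onset frame from each later frame."""
    def forward(self, clip):                      # clip: (B, T, C, H, W)
        onset = clip[:, :1]                       # reference (onset) frame
        return clip[:, 1:] - onset                # (B, T-1, C, H, W) differential frames

class STCPNetSketch(nn.Module):
    """Toy spatio-temporal capsule classifier (hypothetical layer sizes)."""
    def __init__(self, num_classes=3, caps_dim=8):
        super().__init__()
        self.diff = DifferentialFeatures()
        # 3D convolutions capture joint spatial and temporal structure of the differences.
        self.conv = nn.Conv3d(3, 32, kernel_size=(3, 5, 5), stride=(1, 2, 2), padding=(1, 2, 2))
        self.primary = nn.Conv3d(32, 32, kernel_size=3, stride=2, padding=1)
        self.caps_dim = caps_dim
        self.fc = nn.LazyLinear(num_classes)      # final fully connected classifier

    def forward(self, clip):                      # clip: (B, T, 3, H, W)
        x = self.diff(clip).transpose(1, 2)       # -> (B, 3, T-1, H, W) for Conv3d
        x = F.relu(self.conv(x))
        x = self.primary(x)                       # primary capsule pre-activations
        b = x.size(0)
        caps = squash(x.reshape(b, -1, self.caps_dim))  # group channels into capsule vectors
        return self.fc(caps.flatten(1))           # logits over ME classes

if __name__ == "__main__":
    video = torch.randn(2, 9, 3, 64, 64)          # two clips of 9 frames each
    print(STCPNetSketch()(video).shape)           # torch.Size([2, 3])

For the LOSO protocol mentioned above, one common realisation is scikit-learn's LeaveOneGroupOut with subject identifiers as the groups, so that each fold holds out every clip from a single subject.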
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3242871