Realistic Facial Expression Reconstruction for VR HMD Users

Bibliographic Details
Published in: IEEE Transactions on Multimedia, Vol. 22, No. 3, pp. 730-743
Main Authors: Lou, Jianwen; Wang, Yiming; Nduka, Charles; Hamedi, Mahyar; Mavridou, Ifigeneia; Wang, Fei-Yue; Yu, Hui
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2020

Summary: We present a system for sensing and reconstructing the facial expressions of a virtual reality (VR) head-mounted display (HMD) user. The HMD occludes a large portion of the user's face, which makes most existing facial performance capture techniques intractable. To tackle this problem, a novel hardware solution with electromyography (EMG) sensors attached to the headset frame is applied to track facial muscle movements. For realistic facial expression recovery, we first reconstruct the user's 3D face from a single image and generate personalized blendshapes associated with seven facial action units (AUs) on the most emotionally salient facial parts (ESFPs). We then use pre-processed EMG signals to measure the activations of AU-coded facial expressions, which drive the pre-built personalized blendshapes. Since facial expressions are important nonverbal cues to a subject's internal emotional state, we further investigate the relationship between six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) and the detected AUs using a fern classifier. Experiments show that the proposed system can accurately sense and reconstruct high-fidelity common facial expressions while providing useful information about the emotional state of the HMD user.
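The abstract describes driving personalized blendshapes with AU activations derived from EMG. The standard formulation behind such a step is the linear delta-blendshape model, where the output mesh is the neutral face plus an activation-weighted sum of per-AU displacement offsets. The sketch below illustrates that model only; the function and variable names are illustrative assumptions, not the paper's actual interface.

```python
import numpy as np

def blend_face(neutral, au_deltas, au_activations):
    """Linear delta-blendshape model (illustrative sketch).

    neutral        : (V, 3) array of neutral-face vertex positions
    au_deltas      : list of (V, 3) displacement offsets, one per AU
    au_activations : per-AU weights, e.g. normalized EMG-derived
                     activations in [0, 1]
    Returns the deformed (V, 3) mesh: neutral + sum_i w_i * delta_i.
    """
    face = neutral.astype(float).copy()
    for delta, w in zip(au_deltas, au_activations):
        face += w * delta
    return face

# Toy example: a 4-vertex mesh with two AU blendshapes.
neutral = np.zeros((4, 3))
au_deltas = [np.ones((4, 3)), np.full((4, 3), 0.5)]
au_activations = [0.8, 0.2]
face = blend_face(neutral, au_deltas, au_activations)
# Each coordinate is displaced by 0.8 * 1.0 + 0.2 * 0.5 = 0.9.
```

In a pipeline like the one summarized above, the activation weights would be updated per frame from the EMG channels, so only a cheap weighted vertex sum runs at display rate while the per-user deltas are built once offline.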
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2019.2933338