Emotion recognition using facial expressions in an immersive virtual reality application
Published in: Virtual Reality: The Journal of the Virtual Reality Society, Vol. 27, No. 3, pp. 1717-1732
Main Authors:
Format: Journal Article
Language: English
Published: London: Springer London; Springer Nature B.V., 01.09.2023
Summary: Facial expression recognition (FER) is an important method for studying and distinguishing human emotions. In a virtual reality (VR) context, people's emotions are triggered instantly and naturally because of the high immersion and realism of VR. However, when a person wears a head-mounted display (HMD), the eye regions are covered, and FER accuracy drops if the eye-region information is discarded. It is therefore necessary to obtain eye-region information by other means. The main difficulty of FER in an immersive VR context is that conventional FER methods depend on public databases whose images contain complete facial information, so these methods are difficult to apply directly in the VR context. To solve this problem, this paper designs and implements a solution for FER in the VR context as follows. A real facial expression database collection scheme for the VR context is implemented by adding an infrared camera and an infrared light source to the HMD. A virtual database construction method is presented for FER in the VR context, which can improve the generalization of models. A deep network named the multi-region facial expression recognition model is designed for FER in the VR context.
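The virtual database construction summarized above (deriving HMD-occluded training images from complete-face public databases, with the eye band restored from an infrared capture) can be sketched roughly as follows; the band coordinates, image sizes, and function name are illustrative assumptions, not details from the paper.

```python
import numpy as np

def make_virtual_sample(face, eye_crop=None, hmd_rows=(20, 44)):
    """Build a VR-context training sample from a complete face image.

    face     : (H, W) uint8 grayscale face from a public FER database
    eye_crop : optional eye-region image matching the occluded band,
               standing in for an infrared camera capture
    hmd_rows : row band assumed to be covered by the HMD (illustrative)
    """
    top, bottom = hmd_rows
    out = face.copy()
    out[top:bottom, :] = 0             # HMD occludes the eye band
    if eye_crop is not None:
        out[top:bottom, :] = eye_crop  # restore the band from the IR capture
    return out

# Toy data standing in for a database image and an IR eye-region frame
face = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
eyes = np.random.randint(0, 256, (24, 64), dtype=np.uint8)
occluded = make_virtual_sample(face)        # simulated HMD-only sample
sample = make_virtual_sample(face, eyes)    # sample with eye band restored
```

Samples generated this way keep the lower-face expression cues from the public database while matching the occlusion pattern seen at test time, which is what lets a model trained on them generalize to real HMD imagery.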
ISSN: 1359-4338; 1434-9957
DOI: 10.1007/s10055-022-00720-9