Learning emotional states using personalized calibration tasks

Bibliographic Details
Main Authors: Clar Megan, Shreve Matthew Adam, Bala Raja, Emmett Phillip J, Kumar Jayant, Harte Eric, Subramanian Jeyasri
Format: Patent
Language: English
Published: 19.09.2017

Summary: A method for determining an emotional state of a subject taking an assessment. The method includes eliciting predicted facial expressions from a subject administered questions, each intended to elicit a certain facial expression that conveys a baseline characteristic of the subject; receiving a video sequence capturing the subject answering the questions; determining an observable physical behavior experienced by the subject across a series of frames corresponding to each question; associating the observed behavior with the emotional state that corresponds to the elicited facial expression; and training a classifier using the associations. The method further includes receiving a second video sequence capturing the subject during an assessment and applying features extracted from the second video sequence to the classifier to determine the emotional state of the subject in response to an assessment item administered during the assessment.
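
The summary describes a two-phase pipeline: first train a per-subject classifier on calibration questions, using the expression each question was designed to elicit as the label for the behavior observed in the corresponding frames, then apply that classifier to video captured during the actual assessment. The sketch below illustrates that flow in Python; the feature extractor, the scikit-learn SVC classifier, and all function names are illustrative assumptions, since the patent does not specify a particular feature set or model.

import numpy as np
from sklearn.svm import SVC

def extract_frame_features(frame: np.ndarray) -> np.ndarray:
    """Hypothetical per-frame feature extractor (placeholder).

    A real system might use facial landmarks, action-unit intensities,
    or a learned embedding; here the frame is simply downsampled.
    """
    return frame.astype(np.float32).ravel()[::1000]

def train_calibration_classifier(calibration_videos, elicited_emotions):
    """Train a per-subject classifier from the calibration questions.

    calibration_videos: list of frame sequences, one per calibration question
    elicited_emotions:  emotion label each question was intended to elicit
    """
    X, y = [], []
    for frames, emotion in zip(calibration_videos, elicited_emotions):
        for frame in frames:
            X.append(extract_frame_features(frame))
            # Associate the observed behavior with the elicited emotional state.
            y.append(emotion)
    clf = SVC(probability=True)
    clf.fit(np.stack(X), np.array(y))
    return clf

def classify_assessment_frames(clf, assessment_frames):
    """Apply the trained classifier to frames captured during the assessment."""
    X = np.stack([extract_frame_features(f) for f in assessment_frames])
    return clf.predict(X)

In use, the calibration step would run once per subject before the assessment, and classify_assessment_frames would then be invoked per assessment item to estimate the subject's emotional state for that item.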
Bibliography: Application Number: US201615149284