Automatic measurement of ad preferences from facial responses gathered over the Internet

Bibliographic Details
Published in: Image and Vision Computing, Vol. 32, No. 10, pp. 630-640
Main Authors: McDuff, Daniel; El Kaliouby, Rana; Senechal, Thibaud; Demirdjian, David; Picard, Rosalind
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.10.2014

Summary: We present an automated method for classifying “liking” and “desire to view again” of online video ads based on 3268 facial responses to media collected over the Internet. The results demonstrate the possibility of an ecologically valid, unobtrusive evaluation of commercial “liking” and “desire to view again”, both strong predictors of marketing success, based only on facial responses. The area under the curve for the best “liking” classifier was 0.82 under a challenging leave-one-commercial-out testing regime (accuracy = 81%). We build on preliminary findings and show that improved smile detection can reduce misclassifications. A comparison of the two smile-detection algorithms showed that improved smile detection helps correctly classify responses recorded in challenging lighting conditions and those in which the expressions were subtle. Temporal discriminative approaches to classification performed most strongly, showing that temporal information about an individual's response is important: it is not just how much a viewer smiles but when they smile. The technique could be employed to personalize video content presented to people as they view videos over the Internet, or in copy testing to unobtrusively quantify ad effectiveness.
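
The leave-one-commercial-out regime described in the summary is, in effect, grouped cross-validation in which all responses to one commercial are held out per fold, so the classifier is always scored on an ad it never saw during training. The sketch below shows how such an evaluation loop could be set up; the synthetic smile-intensity features, the linear SVM, and the scikit-learn utilities (LeaveOneGroupOut, roc_auc_score) are illustrative assumptions, not the authors' actual pipeline.

# Minimal sketch (Python) of a leave-one-commercial-out evaluation loop.
# Features, labels, and the classifier are hypothetical stand-ins for the
# paper's smile-response features and temporal discriminative models.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_responses, n_features = 300, 20                 # hypothetical dataset size
X = rng.normal(size=(n_responses, n_features))    # e.g. binned smile-intensity statistics
y = rng.integers(0, 2, size=n_responses)          # 1 = "liked" the ad, 0 = did not
groups = rng.integers(0, 15, size=n_responses)    # commercial ID of each response

aucs = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = SVC(kernel="linear", probability=True)  # any probabilistic classifier could be used
    clf.fit(X[train_idx], y[train_idx])
    scores = clf.predict_proba(X[test_idx])[:, 1]
    if len(np.unique(y[test_idx])) == 2:          # AUC is defined only when both classes appear
        aucs.append(roc_auc_score(y[test_idx], scores))

print(f"Mean leave-one-commercial-out AUC: {np.mean(aucs):.2f}")

Holding out whole commercials rather than individual viewers is what makes the reported AUC of 0.82 an estimate of generalization to unseen ads, rather than to unseen responses to ads already used in training.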
ISSN: 0262-8856, 1872-8138
DOI: 10.1016/j.imavis.2014.01.004