A blendshape model for mapping facial motions to an android

Bibliographic Details
Published in: 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 542 - 547
Main Authors: Wilbers, F., Ishi, C., Ishiguro, H.
Format: Conference Proceeding
Language: English, Japanese
Published: IEEE, 01.10.2007
ISBN: 9781424409112, 142440911X
ISSN: 2153-0858
DOI: 10.1109/IROS.2007.4399394

Summary: An important part of natural, and therefore effective, communication is facial motion. The android Repliee Q2 should therefore display realistic facial motion. In computer graphics animation, such motion is created by mapping human motion to the animated character. This paper proposes a method for mapping human facial motion to the android. This is done using a linear model of the android, based on the blendshape models used in computer graphics. The model is derived from motion capture of the android and therefore also captures the android's physical limitations. The paper shows that the blendshape method can be successfully applied to the android, and that a linear model is sufficient for representing android facial motion, which keeps control very straightforward. Measurements of the produced motion reveal the physical limitations of the android and identify the main areas for improvement of the android design.
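
As a rough illustration of the linear blendshape idea the summary refers to, the Python sketch below models a face as a neutral shape plus a weighted sum of blendshape displacements, and recovers blendshape weights for a captured target shape by linear least squares. This is only a generic sketch under assumed conventions; the function names, the least-squares weight recovery, the clipping to a [0, 1] range, and the toy data are illustrative assumptions, not the authors' implementation or the paper's mapping to the android's actuators.

    # Generic linear blendshape sketch (illustrative only, not the paper's code).
    # A face configuration is the neutral shape plus a weighted sum of
    # blendshape displacements; weights for a captured target shape are
    # estimated with linear least squares and clipped to a fixed range,
    # loosely mirroring physical actuator limits (assumption).
    import numpy as np

    def blend(neutral, blendshapes, weights):
        """Return vertex positions: neutral + sum_k w_k * (blendshape_k - neutral)."""
        deltas = blendshapes - neutral            # shape: (K, 3N)
        return neutral + weights @ deltas

    def solve_weights(neutral, blendshapes, target, w_min=0.0, w_max=1.0):
        """Estimate blendshape weights that best reproduce a captured target shape."""
        deltas = (blendshapes - neutral).T        # shape: (3N, K)
        w, *_ = np.linalg.lstsq(deltas, target - neutral, rcond=None)
        return np.clip(w, w_min, w_max)           # keep weights in an assumed valid range

    # Toy example: 2 blendshapes over 3 flattened marker coordinates.
    neutral = np.zeros(3)
    blendshapes = np.array([[1.0, 0.0, 0.0],
                            [0.0, 1.0, 0.5]])
    target = np.array([0.6, 0.3, 0.15])
    w = solve_weights(neutral, blendshapes, target)
    print(w, blend(neutral, blendshapes, w))

Because the model is linear in the weights, recovering a control vector for any captured expression reduces to a single least-squares solve, which is consistent with the summary's point that a linear model keeps control straightforward.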