A blendshape model for mapping facial motions to an android
| Published in | 2007 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 542 - 547 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English, Japanese |
| Published | IEEE, 01.10.2007 |
| ISBN | 9781424409112, 142440911X |
| ISSN | 2153-0858 |
| DOI | 10.1109/IROS.2007.4399394 |
Summary: An important part of natural, and therefore effective, communication is facial motion. The android Repliee Q2 should therefore display realistic facial motion. In computer graphics animation, such motion is created by mapping human motion onto the animated character. This paper proposes a method for mapping human facial motion to the android. This is done using a linear model of the android, based on the blendshape models used in computer graphics. The model is derived from motion capture of the android and therefore also captures the android's physical limitations. The paper shows that the blendshape method can be successfully applied to the android, and that a linear model is sufficient for representing android facial motion, so control can remain very simple. Measurements of the produced motion identify the physical limitations of the android and point to the main areas for improving the android design.
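The summary describes a linear blendshape model: a face pose is a neutral pose plus a weighted sum of basis deformations, with weights recovered from motion-capture data. The sketch below illustrates that general formulation only; the variable names, dimensions, random data, and the weight-clipping step standing in for physical limits are illustrative assumptions, not the paper's actual model or measurements.

```python
# Illustrative blendshape sketch (assumed setup, not the authors' implementation):
# a pose is neutral + B @ w, and weights for a captured pose are fit by least squares.
import numpy as np

rng = np.random.default_rng(0)

n_coords = 90        # assumed: 30 facial markers x 3 coordinates
n_blendshapes = 10   # assumed number of basis shapes

neutral = rng.normal(size=n_coords)                  # neutral face, stacked coordinates
B = rng.normal(size=(n_coords, n_blendshapes))       # columns: displacement of each blendshape

def synthesize(weights):
    """Linear blendshape model: neutral pose plus weighted blendshape displacements."""
    return neutral + B @ weights

def fit_weights(target, w_min=0.0, w_max=1.0):
    """Fit blendshape weights to a target pose by least squares, then clip to an
    admissible range (a crude stand-in for the android's physical limitations)."""
    w, *_ = np.linalg.lstsq(B, target - neutral, rcond=None)
    return np.clip(w, w_min, w_max)

# Map a captured pose back to weights and reconstruct it.
true_w = rng.uniform(0.0, 1.0, size=n_blendshapes)
captured = synthesize(true_w)
estimated_w = fit_weights(captured)
print("reconstruction error:", np.linalg.norm(synthesize(estimated_w) - captured))
```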