Dynamic Gesture Analysis for Distinguishing Between Intentional and Unconscious Motions

Bibliographic Details
Published in: Advances in Human Factors and System Interactions, pp. 35-42
Main Author: Naka, Toshiya
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing
Series: Advances in Intelligent Systems and Computing

Summary: In human communication, nonverbal information such as gestures and facial expressions plays a greater role than language, and gestures are known to serve as a major channel when designing intimate conversational systems between humans and robots. However, one of the chief problems with such gesture-based interaction is that it is difficult to realize effective actions and to distinguish reliably between unconscious and intentional gestures: systems tend to respond erroneously to unconscious movements, which impedes natural communication. In this study, the authors propose a method for analyzing the mechanisms of effective gestures using dynamics: they have extended their analytical method to specifically identify intentional gestures, and found that these can be quantified by the value of, and slight changes in, the torque of the main joints. Humans tend to add “preparation” and “follow-through” motions just before and after an intentional motion, and each behavior can be distinguished by its “undershoot” or “overshoot” value if torque changes are measured with high precision. The proposed method has the potential not only to solve the problems facing gesture-based interfaces but also to inform human–robot communication strategies that can overcome the “uncanny valley” [1].
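
As a rough illustration of the detection idea in the summary, the following is a minimal Python sketch. It is not the authors' implementation: the function name, the fixed thresholds, and the use of a median resting baseline are all illustrative assumptions. It flags a joint-torque trace as intentional when a preparation “undershoot” precedes the main torque excursion and a follow-through “overshoot” follows it.

import numpy as np

def is_intentional(torque, undershoot_thresh=-0.1, overshoot_thresh=0.1):
    # Hypothetical sketch, not from the paper: the thresholds and the
    # median resting baseline are illustrative assumptions.
    baseline = np.median(torque)               # resting torque level
    deviation = torque - baseline              # signed departure from rest
    peak = int(np.argmax(np.abs(deviation)))   # index of the main motion
    before, after = deviation[:peak], deviation[peak + 1:]
    has_undershoot = before.size > 0 and before.min() < undershoot_thresh
    has_overshoot = after.size > 0 and after.max() > overshoot_thresh
    return bool(has_undershoot and has_overshoot)

# Synthetic torque trace: a main peak with a preparation dip before it
# and a follow-through bump after it, as the summary describes.
t = np.linspace(0, 1, 200)
trace = (np.exp(-((t - 0.5) / 0.05) ** 2)
         - 0.3 * np.exp(-((t - 0.35) / 0.04) ** 2)
         + 0.3 * np.exp(-((t - 0.65) / 0.04) ** 2))
print(is_intentional(trace))  # True for this synthetic trace

In practice the thresholds would need to be calibrated per joint and per subject; the sketch only makes the undershoot/overshoot criterion concrete.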
ISBN: 9783319419558; 3319419552
ISSN: 2194-5357; 2194-5365
DOI: 10.1007/978-3-319-41956-5_4