Non-intrusive Gesture Recognition in Real Companion Environments


Bibliographic Details
Published in: Companion Technology, pp. 321-343
Main Authors: Handrich, Sebastian; Rashid, Omer; Al-Hamadi, Ayoub
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 01.01.2017
Series: Cognitive Technologies

Summary: Automatic gesture recognition pushes Human-Computer Interaction (HCI) closer to human-human interaction. Although gesture recognition technologies have been successfully applied to real-world applications, several problems still need to be addressed before HCI systems can be used more widely. Firstly, gesture recognition requires robust tracking of the relevant body parts, which is challenging because the human body is capable of an enormous range of poses. A pose estimation approach is therefore proposed that identifies body parts based on geodesic distances; in addition, the generation of synthetic data, which is essential for training and evaluation, is presented. A second problem is that gestures are spatio-temporal patterns that can vary in shape, trajectory, or duration, even for the same person. Static patterns are recognized using geometrical and statistical features that are invariant to translation, rotation, and scaling, while dynamic patterns are classified with stochastic models such as Hidden Markov Models and Conditional Random Fields applied to quantized trajectories. Lastly, a spotting approach based on a non-gesture model is proposed that separates meaningful gestures from random hand movements.
ISBN: 3319436643; 9783319436647
ISSN: 1611-2482; 2197-6635
DOI: 10.1007/978-3-319-43665-4_16
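As an illustration of the dynamic-pattern classification described in the summary, the sketch below scores a quantized hand trajectory (a sequence of codebook indices) against one discrete Hidden Markov Model per gesture class using the scaled forward algorithm, and picks the best-scoring class. This is a generic sketch of per-class HMM evaluation, not the authors' implementation; the state counts, codebook size, gesture labels, and parameter values are invented placeholders.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM).

    obs : sequence of codebook indices (quantized trajectory)
    pi  : (N,) initial state distribution
    A   : (N, N) state transition matrix
    B   : (N, M) emission probabilities over M codebook symbols
    """
    alpha = pi * B[:, obs[0]]
    scale = alpha.sum()
    log_lik = np.log(scale)
    alpha /= scale
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # predict next state, weight by emission
        scale = alpha.sum()             # rescale to avoid numerical underflow
        log_lik += np.log(scale)
        alpha /= scale
    return log_lik

def classify(obs, models):
    """Return the gesture label whose HMM assigns the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

# Toy 2-state HMMs over a 4-symbol codebook (placeholder parameters).
models = {
    "circle": (np.array([0.8, 0.2]),
               np.array([[0.7, 0.3], [0.4, 0.6]]),
               np.array([[0.6, 0.2, 0.1, 0.1], [0.1, 0.1, 0.2, 0.6]])),
    "swipe":  (np.array([0.5, 0.5]),
               np.array([[0.9, 0.1], [0.2, 0.8]]),
               np.array([[0.1, 0.6, 0.2, 0.1], [0.3, 0.1, 0.5, 0.1]])),
}

print(classify([0, 0, 1, 3, 3], models))
```

In practice the per-class models would be trained on example trajectories (e.g. with Baum-Welch), and the same evaluate-and-compare scheme extends naturally to a non-gesture model that competes with the gesture models, which is the basic idea behind the spotting of meaningful gestures among random hand movements.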