Spatial-Temporal Shape and Motion Features for Dynamic Hand Gesture Recognition in Depth Video
Published in: International Journal of Image, Graphics and Signal Processing, Vol. 10, No. 9, pp. 17-26
Main Authors: , , ,
Format: Journal Article
Language: English
Published: Hong Kong: Modern Education and Computer Science Press, 08.09.2018
Summary: Human-Computer Interaction (HCI) is one of the most interesting and challenging research topics in the computer vision community. Among the various HCI methods, hand gestures are a natural means of interaction and have attracted many researchers, since they allow people to interact with machines easily and conveniently through hand movements. With the advent of depth sensors, many new techniques have been developed and have achieved notable results. In this work, we propose a set of features extracted from depth maps for dynamic hand gesture recognition. We extract HOG2 to represent the shape and appearance of the hand, and, to capture hand motion, we propose a new feature named HOF2, computed from optical flow. These spatial-temporal descriptors are easy to understand and implement, yet perform very well in multi-class classification. They also have a low computational cost, making them suitable for real-time recognition systems. Furthermore, we apply Robust PCA to reduce the feature dimensionality and build robust, compact gesture descriptors. The results are evaluated under a cross-validation scheme using an SVM classifier, which achieves 95.51% and 55.95% accuracy on the challenging MSR Hand Gestures Dataset and the VIVA Challenge Dataset, respectively.
ISSN: 2074-9074, 2074-9082
DOI: 10.5815/ijigsp.2018.09.03
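
The summary describes a four-stage pipeline: per-frame HOG2 shape features, optical-flow-based HOF2 motion features, Robust PCA dimensionality reduction, and cross-validated SVM classification. The code below is a minimal sketch of such a pipeline, not the authors' implementation: it assumes scikit-image HOG, OpenCV Farneback optical flow, plain PCA standing in for Robust PCA, an RBF-kernel SVM, and depth sequences resampled to a common frame count and frame size; all function names and parameter values are illustrative.

```python
# Minimal sketch of the pipeline described in the summary, not the authors'
# code. Assumptions: scikit-image HOG, OpenCV Farneback optical flow, plain
# PCA standing in for Robust PCA, an RBF-kernel SVM, and every sequence
# resampled to the same frame count and frame size so descriptors align.
import numpy as np
import cv2
from skimage.feature import hog
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def hog2_descriptor(frames, bins=9):
    """Shape/appearance cue: HOG on each depth frame, then a second HOG over
    the stacked per-frame descriptors to encode their temporal evolution."""
    per_frame = np.stack([
        hog(f, orientations=bins, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
        for f in frames
    ])                                         # (T, D) descriptor matrix
    return hog(per_frame, orientations=bins,
               pixels_per_cell=(4, 4), cells_per_block=(1, 1))


def hof2_descriptor(frames, bins=9):
    """Motion cue: flow-orientation histogram per frame pair, then a second
    HOG over the stacked histograms (an HOF2-like construction)."""
    hists = []
    for prev, nxt in zip(frames[:-1], frames[1:]):
        # Farneback flow expects 8-bit single-channel images.
        p = cv2.normalize(prev, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        n = cv2.normalize(nxt, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        flow = cv2.calcOpticalFlowFarneback(p, n, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        h, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
        hists.append(h / (h.sum() + 1e-8))     # magnitude-weighted, normalized
    return hog(np.stack(hists), orientations=bins,
               pixels_per_cell=(2, 2), cells_per_block=(1, 1))


def evaluate(sequences, labels, folds=5):
    """Concatenate HOG2 + HOF2 per sequence, compress with PCA, and
    cross-validate an SVM classifier, mirroring the evaluation protocol
    sketched in the summary."""
    X = np.stack([np.concatenate([hog2_descriptor(s), hof2_descriptor(s)])
                  for s in sequences])
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=0.95),  # keep 95% of the variance
                          SVC(kernel="rbf", C=10, gamma="scale"))
    return cross_val_score(model, X, np.asarray(labels), cv=folds)
```

For example, `evaluate(depth_sequences, gesture_labels)` returns per-fold accuracies. Note that the PCA step here is ordinary PCA rather than the Robust PCA used in the paper, and hyperparameters such as the SVM's C and the histogram bin counts are placeholders rather than values reported by the authors.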