Robust hand pose estimation using visual sensor in IoT environment

Bibliographic Details
Published in: The Journal of Supercomputing, Vol. 76, No. 7, pp. 5382–5401
Main Authors: Kim, Sul-Ho; Jang, Seok-Woo; Park, Jin-Ho; Kim, Gye-Young
Format: Journal Article
Language: English
Published: New York: Springer US, 01.07.2020 (Springer Nature B.V.)
Summary: In Internet of Things (IoT) environments, high-performance visual sensors have been used to create and apply various kinds of image data. In the field of human–computer interaction in particular, image sensor interfaces driven by human hands are applicable to sign language recognition, games, object manipulation in virtual reality, and remote surgery. With the popularization of depth cameras, research formerly conducted on RGB images has attracted renewed interest. Nevertheless, hand pose estimation remains difficult: it involves high-dimensional degrees of freedom, shape changes, self-occlusion, and real-time constraints. To address these issues, this study proposes a random forest-based method that estimates hand pose hierarchically in depth images. The hierarchical scheme handles the palm and the fingers separately, using an inverse matrix, to cope with the high-dimensional degrees of freedom, shape changes, and self-occlusion. For real-time execution, random forests built on simple features are applied. The experimental results show that the proposed hierarchical method estimates hand pose in input depth images more robustly and quickly than existing methods.
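The abstract only sketches the approach, but the combination it names, random forests over simple features for pose prediction on depth images, is a well-known pattern. Below is a minimal, hypothetical sketch of that general technique, not the authors' pipeline: it assumes scikit-learn's RandomForestRegressor, depth-difference features in the style commonly used for pose estimation from depth, and synthetic stand-in data; every name, offset count, and parameter here is illustrative.

```python
# Minimal sketch of random-forest pose regression on depth images.
# NOT the paper's exact method: it assumes depth-difference features as the
# "simple characteristics" the abstract mentions, and synthetic stand-in
# depth maps and targets. All parameters are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def depth_difference_features(depth, pixels, offsets):
    """Feature f = d(x + u/d(x)) - d(x + v/d(x)) for each offset pair (u, v).

    Normalizing the pixel offsets by the depth at the reference pixel makes
    the feature roughly invariant to the hand's distance from the camera.
    """
    h, w = depth.shape
    feats = np.empty((len(pixels), len(offsets)), dtype=np.float32)
    for i, (y, x) in enumerate(pixels):
        d = max(depth[y, x], 1e-3)  # avoid division by zero on empty pixels
        for j, (u, v) in enumerate(offsets):
            y1 = int(np.clip(y + u[0] / d, 0, h - 1))
            x1 = int(np.clip(x + u[1] / d, 0, w - 1))
            y2 = int(np.clip(y + v[0] / d, 0, h - 1))
            x2 = int(np.clip(x + v[1] / d, 0, w - 1))
            feats[i, j] = depth[y1, x1] - depth[y2, x2]
    return feats

# Random offset pairs, fixed once at training time.
offsets = [(rng.uniform(-30, 30, 2), rng.uniform(-30, 30, 2)) for _ in range(64)]

# Hypothetical training data: synthetic depth maps with per-pixel 3-D targets
# (e.g., the offset from the pixel to the palm center).
X, y = [], []
for _ in range(200):
    depth = rng.uniform(0.4, 1.2, size=(120, 160)).astype(np.float32)
    pixels = rng.integers(0, [120, 160], size=(50, 2))
    X.append(depth_difference_features(depth, pixels, offsets))
    y.append(rng.uniform(-0.1, 0.1, size=(50, 3)))
X, y = np.vstack(X), np.vstack(y)

# A shallow forest with few trees keeps per-frame inference cheap, which is
# the usual reason this family of methods can run in real time.
forest = RandomForestRegressor(n_estimators=10, max_depth=12, n_jobs=-1)
forest.fit(X, y)
```

A hierarchical variant in the spirit of the paper would run one such predictor for the palm pose first and then condition separate finger predictors on it; that split is what the abstract credits for handling the high-dimensional degrees of freedom and self-occlusion.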
ISSN: 0920-8542 (print); 1573-0484 (electronic)
DOI: 10.1007/s11227-019-03082-3