Hand Gesture Recognition From Wrist-Worn Camera for Human-Machine Interaction

Bibliographic Details
Published in: IEEE Access, Vol. 11, pp. 53262–53274
Main Authors: Nguyen, Hong-Quan; Le, Trung-Hieu; Tran, Trung-Kien; Tran, Hoang-Nhat; Tran, Thanh-Hai; Le, Thi-Lan; Vu, Hai; Pham, Cuong; Nguyen, Thanh Phuong; Nguyen, Huu Thanh
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023

Summary: In this work, we study the use of hand gestures for human-machine interaction from wrist-worn sensors. Towards this goal, we design a wrist-worn prototype that captures an RGB video stream of hand gestures. We then build a new wrist-worn gesture dataset (named WiGes) with various subjects interacting with home appliances in different environments. To the best of our knowledge, this is the first benchmark released for studying hand gestures from a wrist-worn camera. We evaluate various CNN models for vision-based recognition and analyze in depth the models that offer the best trade-off between accuracy, memory requirement, and computational cost. Among the studied architectures, MoviNet produces the highest accuracy. We therefore introduce a new MoviNet-based two-stream architecture that takes both RGB and optical flow into account; it increases Top-1 accuracy by 1.36% and 3.67% under two evaluation protocols. Our dataset, baselines, and model analysis provide instructive recommendations for human-machine interaction using hand-held devices.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3279845