WiGRUNT: WiFi-Enabled Gesture Recognition Using Dual-Attention Network

Bibliographic Details
Published in: IEEE Transactions on Human-Machine Systems, Vol. 52, No. 4, pp. 736-746
Main Authors: Gu, Yu; Zhang, Xiang; Wang, Yantong; Wang, Meng; Yan, Huan; Ji, Yusheng; Liu, Zhi; Li, Jianhua; Dong, Mianxiong
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2022
Summary: Gestures constitute an important form of nonverbal communication in which bodily actions deliver messages on their own or in parallel with spoken words. Recently, WiFi sensing-enabled gesture recognition has emerged as a trend owing to its inherent merits such as remote sensing, non-line-of-sight coverage, and privacy friendliness. However, current WiFi-based approaches mainly rely on domain-specific training because they do not know "where to look" and "when to look." To this end, we propose WiGRUNT, a WiFi-enabled gesture recognition system using a dual-attention network, to mimic how a keen human interprets a gesture regardless of environmental variations. The key insight is to train the network to dynamically focus on the domain-independent features of a gesture in the WiFi channel state information (CSI) via a spatial-temporal dual-attention mechanism. WiGRUNT builds on a deep residual network (ResNet) backbone to evaluate the importance of spatial-temporal clues and exploit their inbuilt sequential correlations for fine-grained gesture recognition. We evaluate WiGRUNT on the open Widar3 dataset and show that it significantly outperforms its state-of-the-art rivals, achieving the best-ever performance both in-domain and cross-domain.
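
The dual-attention idea in the abstract lends itself to a compact illustration. Below is a minimal PyTorch sketch of a spatial-temporal dual-attention module feeding a ResNet backbone; it is not the authors' implementation, and the tensor shapes, the 1x1-convolution spatial gate, the per-frame temporal gate, and the class count are illustrative assumptions.

import torch
import torch.nn as nn
from torchvision import models

class DualAttention(nn.Module):
    """Illustrative spatial-temporal dual-attention over CSI feature maps.

    The spatial gate scores "where to look" within each frame; the temporal
    gate scores "when to look" across frames. Shapes and layer choices are
    assumptions for this sketch, not the paper's architecture.
    """

    def __init__(self, channels: int):
        super().__init__()
        # Spatial branch: a 1x1 convolution yields a per-position weight map.
        self.spatial = nn.Sequential(nn.Conv2d(channels, 1, kernel_size=1),
                                     nn.Sigmoid())
        # Temporal branch: a per-frame score from globally pooled features.
        self.temporal = nn.Sequential(nn.Linear(channels, 1), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels, height, width) CSI-derived features.
        b, t, c, h, w = x.shape
        frames = x.reshape(b * t, c, h, w)
        frames = frames * self.spatial(frames)            # "where to look"
        x = frames.reshape(b, t, c, h, w)
        weights = self.temporal(x.mean(dim=(3, 4)))       # (b, t, 1): "when to look"
        return (x * weights[..., None, None]).sum(dim=1)  # fuse frames -> (b, c, h, w)

# Usage: 16 frames of 3-channel 30x30 CSI "images", 6 gesture classes
# (hypothetical numbers loosely modeled on a Widar3-style setup).
attn = DualAttention(channels=3)
fused = attn(torch.randn(2, 16, 3, 30, 30))   # -> (2, 3, 30, 30)
backbone = models.resnet18(num_classes=6)     # ResNet backbone as classifier
logits = backbone(fused)                      # -> (2, 6)

The temporal weights collapse the frame sequence into a single attended feature map, so any image-classification backbone can consume the result unchanged; this mirrors the abstract's division of labor between spatial-temporal importance scoring and the ResNet feature extractor.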
ISSN: 2168-2291, 2168-2305
DOI: 10.1109/THMS.2022.3163189