WiDIGR: Direction-Independent Gait Recognition System Using Commercial Wi-Fi Devices


Bibliographic Details
Published in: IEEE Internet of Things Journal, Vol. 7, No. 2, pp. 1178-1191
Main Authors: Zhang, Lei; Wang, Cong; Ma, Maode; Zhang, Daqing
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.02.2020

More Information
Summary: Gait recognition enables many potential applications that require identification. Wi-Fi-based gait recognition is attractive because it is noninvasive and Wi-Fi devices are ubiquitous. However, since gait information changes with the walking direction, existing Wi-Fi-based gait recognition systems require the subject to walk along a predetermined path. This direction-dependence restriction prevents Wi-Fi-based gait recognition from being widely deployed. To address this issue, a direction-independent gait recognition system, called WiDIGR, is proposed. WiDIGR can recognize a subject by gait regardless of which straight-line path the subject walks along, relaxing the strict constraint imposed by other Wi-Fi-based gait recognition systems. Specifically, based on the Fresnel model, a series of signal processing techniques is proposed to eliminate the differences among signals induced by walking in different directions and to generate a high-quality direction-independent signal spectrogram. Effective features are then extracted, both manually and automatically, from the direction-independent spectrogram. Experimental results in a typical indoor environment demonstrate the superior performance of WiDIGR, with mean accuracy ranging from 78.28% for a group of six subjects to 92.83% for a group of three.
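The abstract's pipeline turns a processed Wi-Fi CSI stream into a time-frequency spectrogram from which gait features are extracted. As a minimal illustrative sketch (not the paper's actual method), the spectrogram step can be approximated with a short-time Fourier transform; the sampling rate, window sizes, and the synthetic signal below are all assumptions standing in for a real denoised CSI amplitude stream.

```python
import numpy as np
from scipy import signal

# Assumed CSI sampling rate (Hz); real systems vary.
fs = 1000
t = np.arange(0, 2.0, 1 / fs)

# Synthetic stand-in for a denoised CSI amplitude stream: a gait-like
# Doppler component near 40 Hz plus measurement noise.
rng = np.random.default_rng(0)
csi = np.cos(2 * np.pi * 40 * t) + 0.1 * rng.standard_normal(t.size)

# Short-time Fourier transform -> spectrogram (frequency bins x time slices).
f, tt, Sxx = signal.spectrogram(csi, fs=fs, nperseg=256, noverlap=192)

# The dominant frequency bin per time slice roughly tracks the Doppler shift,
# the kind of curve gait features would be extracted from.
dominant = f[np.argmax(Sxx, axis=0)]
```

With the parameters above, `dominant` stays close to the injected 40 Hz component; in a real system the subsequent feature-extraction stage would operate on `Sxx` rather than on this single curve.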
ISSN: 2327-4662, 2372-2541
DOI: 10.1109/JIOT.2019.2953488