The Diverse Gait Dataset: Gait Segmentation Using Inertial Sensors for Pedestrian Localization with Different Genders, Heights and Walking Speeds

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 22, No. 4, p. 1678
Main Authors: Huang, Chao; Zhang, Fuping; Xu, Zhengyi; Wei, Jianming
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 21.02.2022
Summary: Stride length estimation is one of the most crucial aspects of Pedestrian Dead Reckoning (PDR). Due to the measurement noise of inertial sensors, individual variances among pedestrians, and the uncertainty of pedestrian walking, stride length estimates carry substantial error, which accumulates into the positional deviation of PDR. With the help of multi-gait analysis, which decomposes strides in time and space with greater detail and accuracy, a novel stride estimation model could improve the performance of PDR across different users. This paper presents a diverse gait dataset collected with inertial sensors, capturing foot movement data from people of different genders, heights, and walking speeds. The dataset contains 4690 walking strides and 19,083 gait labels. Based on this dataset, we propose a threshold-independent stride segmentation algorithm called SDATW that achieves an F-measure of 0.835. We also provide detailed results for recognizing four gaits under different walking speeds, demonstrating the utility of our dataset for training stride segmentation and gait detection algorithms.
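
The record does not detail SDATW's internals, but the abstract describes it as a threshold-independent stride segmentation algorithm, i.e., one that locates strides by template matching rather than by tuned amplitude thresholds. Below is a minimal Python sketch of subsequence dynamic time warping (DTW), a standard technique for this kind of segmentation; the function name subsequence_dtw, the 1-D template/signal setup, and the toy data are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def subsequence_dtw(template, signal):
        """Return (start, end) indices of the best match of `template`
        inside `signal` under dynamic time warping.

        The free-start first row and free-end last row let the match
        begin and end anywhere in the signal, so strides are located by
        warped-distance minima instead of fixed amplitude thresholds."""
        n, m = len(template), len(signal)
        cost = np.abs(template[:, None] - signal[None, :])  # local distances
        D = np.full((n, m), np.inf)
        D[0, :] = cost[0, :]                # free start: any column may open a match
        for i in range(1, n):
            D[i, 0] = D[i - 1, 0] + cost[i, 0]
            for j in range(1, m):
                D[i, j] = cost[i, j] + min(D[i - 1, j],      # stretch template
                                           D[i, j - 1],      # stretch signal
                                           D[i - 1, j - 1])  # diagonal match
        end = int(np.argmin(D[-1, :]))      # free end: cheapest finish in last row
        # Backtrack from (n-1, end) to row 0 to recover where the match starts.
        i, j = n - 1, end
        while i > 0:
            steps = [(i - 1, j - 1), (i - 1, j), (i, j - 1)]
            i, j = min((s for s in steps if s[0] >= 0 and s[1] >= 0),
                       key=lambda s: D[s])
        return j, end

    # Toy usage: one synthetic stride template slid over a noisy three-stride signal.
    rng = np.random.default_rng(0)
    stride = np.sin(np.linspace(0, 2 * np.pi, 50))
    recording = np.concatenate([stride] * 3) + 0.05 * rng.standard_normal(150)
    print(subsequence_dtw(stride, recording))  # roughly one stride's span

Segmenting a full recording would repeat this search while masking each match, then score detections against the ground-truth labels; the reported F-measure is the usual 2 * precision * recall / (precision + recall), so 0.835 reflects the balance between missed and spurious stride detections.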
ISSN: 1424-8220
DOI: 10.3390/s22041678