Bi-Directional Long Short-Term Memory-Based Gait Phase Recognition Method Robust to Directional Variations in Subject’s Gait Progression Using Wearable Inertial Sensor

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 24, No. 4, p. 1276
Main Authors: Jeon, Haneul; Lee, Donghun
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 17.02.2024

More Information
Summary: Inertial Measurement Unit (IMU) sensor-based gait phase recognition is widely used in medical and biomechanics fields requiring gait data analysis. However, there are several limitations due to the low reproducibility of IMU sensor attachment and the sensor outputs being expressed relative to a fixed reference frame; the prediction algorithm may malfunction when the user changes their walking direction. In this paper, we propose a gait phase recognition method robust to user body movements based on a floating body-fixed frame (FBF) and bi-directional long short-term memory (bi-LSTM). Data from four IMU sensors attached to the shanks and feet of both legs of three subjects, collected via the FBF method, are preprocessed and segmented with the sliding window label overlapping method before being fed into the bi-LSTM for training. To improve the model's recognition accuracy, we selected parameters that influence both training and test accuracy, and conducted a sensitivity analysis using a level average analysis of the Taguchi method to identify the optimal combination of parameters. The model, trained with the optimal parameters, was validated on a new subject, achieving a high test accuracy of 86.43%.
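The pipeline the abstract describes, overlapping sliding windows of IMU data classified by a bi-directional LSTM, can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation: the window length (100 samples), stride (25), channel count (24, assuming 4 IMUs × 6 axes), number of gait phases (4), and the rule of labelling each window by its last sample are all assumed values, and the paper's exact label overlapping scheme may differ.

```python
import torch
import torch.nn as nn

def sliding_windows(signal, labels, win=100, stride=25):
    """Segment a (T, C) IMU stream into overlapping windows.
    Each window is labelled with the gait phase of its last sample,
    one plausible reading of 'label overlapping'; assumed, not the
    paper's exact scheme."""
    xs, ys = [], []
    for start in range(0, signal.shape[0] - win + 1, stride):
        xs.append(signal[start:start + win])
        ys.append(labels[start + win - 1])
    return torch.stack(xs), torch.stack(ys)

class BiLSTMGaitClassifier(nn.Module):
    """Bi-directional LSTM over windowed IMU data, followed by a
    linear head that predicts the gait phase of each window."""
    def __init__(self, n_channels=24, hidden=64, n_phases=4):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_phases)  # 2x for both directions

    def forward(self, x):             # x: (batch, win, n_channels)
        out, _ = self.lstm(x)         # (batch, win, 2 * hidden)
        return self.head(out[:, -1])  # logits from the final time step

# Stand-in data only: 1000 samples of 24 IMU channels, 4 phase labels.
signal = torch.randn(1000, 24)
labels = torch.randint(0, 4, (1000,))
X, y = sliding_windows(signal, labels)
logits = BiLSTMGaitClassifier()(X)    # (n_windows, n_phases)
```

Under this reading, the Taguchi level average analysis would treat quantities such as the window length, stride, and hidden size as factors, evaluate accuracy at a few discrete levels of each, and average per level to pick the best combination, which is far cheaper than an exhaustive grid search.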
ISSN: 1424-8220
DOI: 10.3390/s24041276