Unsupervised Gait Phase Estimation With Domain-Adversarial Neural Network and Adaptive Window

Bibliographic Details
Published in: IEEE Journal of Biomedical and Health Informatics, Vol. 26, No. 7, pp. 3373-3384
Main Authors: Choi, Wonseok; Yang, Wonseok; Na, Jaeyoung; Park, Juneil; Lee, Giuk; Nam, Woochul
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2022
Summary: The performance of previous machine learning models for gait phase estimation is satisfactory only under limited conditions. First, they produce accurate estimates only when the ground truth of the gait phase of the target subject is known. In contrast, when the ground truth of a target subject is not used to train an algorithm, the estimation error increases noticeably, and expensive equipment is required to measure the true gait phase precisely. Thus, previous methods have practical shortcomings when they are optimized for individual users. To address this problem, this study introduces an unsupervised domain adaptation technique that estimates the gait phase without the true gait phase of the target subject. Specifically, a domain-adversarial neural network was modified to perform regression on continuous gait phases. Second, the accuracy of previous models can be degraded by variations in stride time. To address this issue, this study developed an adaptive window method that actively considers changes in stride time; this model considerably reduces estimation errors for walking and running motions. Finally, this study proposes a new method for selecting the optimal source subject among several subjects by defining a similarity measure between sequential embedding features.
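
The abstract names a domain-adversarial neural network (DANN) repurposed for regression on a continuous gait phase; the paper's actual architecture and losses are not given in this record. The sketch below is a rough illustration only, assuming PyTorch, with all module and variable names hypothetical. It shows the core DANN idea: a shared feature extractor whose output feeds both a phase regressor and, through a gradient reversal layer, a source-vs-target domain classifier, so that features become subject-invariant while phase labels are needed only for the source subject.

```python
# Hypothetical DANN-for-regression sketch, assuming PyTorch.
# All names are illustrative; this is not the paper's implementation.
import torch
import torch.nn as nn

class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; reverses and scales gradients in
    the backward pass, as in the original DANN formulation."""
    @staticmethod
    def forward(ctx, x, lambda_):
        ctx.lambda_ = lambda_
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambda_ * grad_output, None

class GaitPhaseDANN(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, lambda_: float = 1.0):
        super().__init__()
        self.lambda_ = lambda_
        # Shared feature extractor over a window of sensor signals.
        self.feature = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # Regression head: continuous gait phase in [0, 1).
        # (A circular loss on sin/cos of the phase would avoid the 0/1
        # wraparound; plain MSE is used below only for brevity.)
        self.phase = nn.Sequential(nn.Linear(hidden, 1), nn.Sigmoid())
        # Domain head: source subject (0) vs. target subject (1).
        self.domain = nn.Linear(hidden, 2)

    def forward(self, x):
        z = self.feature(x)
        phase = self.phase(z)
        domain_logits = self.domain(GradientReversal.apply(z, self.lambda_))
        return phase, domain_logits

# One training step: phase loss on labeled source data only; domain
# loss on both domains, which (via gradient reversal) pushes the
# feature extractor toward subject-invariant features.
model = GaitPhaseDANN(n_features=12)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_src, y_src = torch.randn(32, 12), torch.rand(32, 1)  # labeled source
x_tgt = torch.randn(32, 12)                            # unlabeled target
phase_s, dom_s = model(x_src)
_, dom_t = model(x_tgt)
loss = (nn.functional.mse_loss(phase_s, y_src)
        + nn.functional.cross_entropy(dom_s, torch.zeros(32, dtype=torch.long))
        + nn.functional.cross_entropy(dom_t, torch.ones(32, dtype=torch.long)))
opt.zero_grad(); loss.backward(); opt.step()
```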
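The adaptive window is described only as actively considering changes in stride time; the record does not specify the rule. One plausible interpretation, sketched below under stated assumptions (NumPy, an upstream stride-time estimate, hypothetical function names), is a window whose length scales with the current stride time so the model always sees roughly one gait cycle regardless of walking or running speed.

```python
# Hypothetical adaptive-window sketch, assuming NumPy and that a
# stride-time estimate is available from upstream event detection.
import numpy as np

def adaptive_window(signal: np.ndarray, fs: float, stride_time_s: float,
                    strides_per_window: float = 1.0) -> np.ndarray:
    """Return the most recent slice of `signal` sized to span
    `strides_per_window` strides at the current cadence."""
    n = max(1, int(round(fs * stride_time_s * strides_per_window)))
    return signal[-n:]

fs = 100.0                          # assumed sampling rate (Hz)
sig = np.random.randn(1000)         # stand-in for one IMU channel
slow = adaptive_window(sig, fs, stride_time_s=1.2)  # 120 samples
fast = adaptive_window(sig, fs, stride_time_s=0.8)  # 80 samples
print(len(slow), len(fast))
```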
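For the third contribution, the abstract defines a similarity between sequential embedding features to pick the best source subject, without stating the measure. As an assumption-laden sketch, mean cosine similarity between aligned embedding sequences is one simple candidate; all names below are hypothetical.

```python
# Hypothetical source-subject selection sketch, assuming NumPy.
# Mean cosine similarity is an illustrative choice, not the paper's.
import numpy as np

def sequence_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Mean cosine similarity between two aligned embedding
    sequences of shape (T, D)."""
    num = np.sum(a * b, axis=1)
    den = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-12
    return float(np.mean(num / den))

def select_source(target_emb: np.ndarray, source_embs: dict) -> str:
    """Pick the source subject whose embedding sequence is closest
    to the target subject's."""
    return max(source_embs,
               key=lambda s: sequence_similarity(target_emb, source_embs[s]))

# Toy usage: three candidate source subjects, one target subject.
rng = np.random.default_rng(0)
target = rng.normal(size=(50, 16))
sources = {f"subject_{i}": rng.normal(size=(50, 16)) for i in range(3)}
print(select_source(target, sources))
```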
ISSN: 2168-2194
EISSN: 2168-2208
DOI: 10.1109/JBHI.2021.3137413