Wearables-based multi-task gait and activity segmentation using recurrent neural networks
Published in | Neurocomputing (Amsterdam), Vol. 432, pp. 250–261 |
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | Elsevier B.V., 07.04.2021 |
Subjects | |
ISSN | 0925-2312; 1872-8286 |
DOI | 10.1016/j.neucom.2020.08.079 |
Summary: Human activity recognition (HAR) and cycle analysis, such as gait analysis, have become an integral part of daily life, from gesture recognition to step counting. As the available data and the range of possible applications grow, an efficient solution that does not require handcrafted feature extraction is needed. We propose a multi-task recurrent neural network architecture that uses inertial sensor data to both segment and recognise activities and cycles. The solution is validated on three publicly available datasets comprising more than 120 subjects and 8 activities, 6 of which are cyclic. Our architecture is smaller than comparable HAR models while being robust to different sensor placements and channels. Our proposed solution outperforms or defines the state of the art for HAR and cycle analysis using inertial sensors. We achieve an overall activity F1-score of 92.6% and a phase detection F1-score of 98.2%. The gait analysis achieves a mean stride time error of 5.3 ± 51.9 ms and a swing duration error of 0.0 ± 5.9%. The overall step count error across all activities is −1.5 ± 2.8%. Thus, we provide a method that does not depend on handcrafted feature extraction and a model that is sensor- and location-independent.
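
The record gives no implementation details beyond the abstract. Purely as an illustrative sketch of the kind of multi-task recurrent model the abstract describes (a shared recurrent encoder over raw inertial channels with separate per-timestep heads for activity recognition and cycle/phase segmentation), a hypothetical PyTorch outline might look as follows; all layer sizes, head definitions, and the per-timestep output assumption are my own and are not taken from the paper.

```python
import torch
import torch.nn as nn

class MultiTaskGaitRNN(nn.Module):
    """Hypothetical multi-task RNN: a shared bidirectional GRU encoder over
    raw IMU channels with two per-timestep heads, one for activity labels
    and one for gait-cycle phases. Sizes and heads are illustrative only."""

    def __init__(self, n_channels=6, n_activities=8, n_phases=3, hidden=64):
        super().__init__()
        # Shared recurrent encoder over the raw accelerometer/gyroscope sequence
        self.encoder = nn.GRU(n_channels, hidden, num_layers=2,
                              batch_first=True, bidirectional=True)
        # Task head 1: per-timestep activity recognition
        self.activity_head = nn.Linear(2 * hidden, n_activities)
        # Task head 2: per-timestep cycle segmentation (e.g. stance/swing/non-cyclic)
        self.phase_head = nn.Linear(2 * hidden, n_phases)

    def forward(self, x):
        # x: (batch, time, channels) window of raw inertial samples
        h, _ = self.encoder(x)
        return self.activity_head(h), self.phase_head(h)


# Joint training would combine the two per-timestep cross-entropy losses.
model = MultiTaskGaitRNN()
x = torch.randn(4, 200, 6)                       # 4 windows, 200 samples, 6 IMU channels
act_logits, phase_logits = model(x)
act_target = torch.randint(0, 8, (4, 200))
phase_target = torch.randint(0, 3, (4, 200))
loss = (nn.functional.cross_entropy(act_logits.transpose(1, 2), act_target)
        + nn.functional.cross_entropy(phase_logits.transpose(1, 2), phase_target))
loss.backward()
```

Because the encoder operates on whatever channels are supplied, a shared-backbone design of this kind is one plausible way to obtain the sensor- and placement-robustness the abstract claims, but the paper itself should be consulted for the actual architecture and training setup.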