Autoencoder-based Approaches for Upper Limb Use Detection

Bibliographic Details
Published in: IEEE Access, p. 1
Main Authors: Neelakandan, Parvathy; Varadhan, SKM; Balasubramanian, Sivakumar
Format: Journal Article
Language: English
Published: IEEE, 2025

Summary: Feature extraction is an essential step in traditional machine learning approaches for detecting upper limb use from inertial measurement unit movement data. However, manual feature extraction requires domain expertise, and intra- and inter-subject variations in movement patterns within and across activities can affect model performance. The use of autoencoders for automatic feature extraction in upper limb use detection has not yet been investigated. To this end, this study evaluated the performance of three autoencoder-based approaches for upper limb use detection using data from 10 healthy individuals (right and left limbs) and 5 hemiparetic individuals (affected and unaffected limbs) performing a set of activities while wearing two wrist-worn inertial measurement units. A supervised autoencoder whose latent features were fed to a random forest classifier yielded performance gains over previous studies based on manual feature extraction for the affected and unaffected limbs of hemiparetic patients (stroke or TBI) and the right limb of healthy participants. Given the limited dataset, further validation with a larger sample and a broader set of tasks is necessary to confirm the generalizability of these findings. Advanced machine learning models can improve the quantification of upper limb use and thereby strengthen the assessment of rehabilitation outcomes with wearable technologies.
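
The summary describes a pipeline in which a supervised autoencoder learns latent features from windowed wrist-worn IMU signals and a random forest then classifies each window as upper limb use or non-use. The paper's exact architecture, window length, and loss weighting are not given in this record, so the sketch below is only a minimal illustration under assumed choices (a flattened 128-sample, 6-channel window, a 16-dimensional latent space, and a joint reconstruction-plus-classification loss), written with PyTorch and scikit-learn.

```python
# Illustrative sketch (not the authors' code): a supervised autoencoder whose
# latent features feed a random forest classifier, as outlined in the summary.
# Window length, layer sizes, and loss weighting are assumptions.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

WINDOW = 128 * 6   # assumed: 128-sample windows x 6 IMU channels (accel + gyro), flattened
LATENT = 16        # assumed latent dimension

class SupervisedAutoencoder(nn.Module):
    """Autoencoder with an auxiliary classification head on the latent code."""
    def __init__(self, in_dim=WINDOW, latent_dim=LATENT):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim))
        self.classifier = nn.Linear(latent_dim, 1)  # binary: use vs. non-use

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), self.classifier(z), z

def train(model, X, y, epochs=50, alpha=0.5, lr=1e-3):
    """Joint loss: reconstruction MSE + alpha * binary cross-entropy (assumed weighting)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    mse, bce = nn.MSELoss(), nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        opt.zero_grad()
        recon, logits, _ = model(X)
        loss = mse(recon, X) + alpha * bce(logits.squeeze(1), y)
        loss.backward()
        opt.step()
    return model

# Toy data standing in for windowed wrist IMU signals with use/non-use labels.
rng = np.random.default_rng(0)
X = torch.tensor(rng.standard_normal((200, WINDOW)), dtype=torch.float32)
y = torch.tensor(rng.integers(0, 2, 200), dtype=torch.float32)

model = train(SupervisedAutoencoder(), X, y)
with torch.no_grad():
    _, _, Z = model(X)

# Latent features become inputs to a random forest classifier.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(Z.numpy(), y.numpy())
print("training accuracy:", rf.score(Z.numpy(), y.numpy()))
```

In such a setup the latent features for each window would stand in for the hand-crafted features used in earlier work, with the classifier trained and evaluated separately per limb.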
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3598806