Comparative Learning for Cross-Subject Finger Movement Recognition in Three Arm Postures via Data Glove


Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 33, p. 1
Main Authors: Jiang, Lei; Zeng, Fengmeng; Yu, Annie
Format: Journal Article
Language: English
Published: United States: IEEE, 01.01.2025

Summary: Reliable recognition of therapeutic hand and finger movements is a prerequisite for effective home-based rehabilitation, where patients must exercise without continuous therapist supervision. Inter-subject variability, stemming from differences in hand size, joint flexibility, and movement speed, limits the generalization of data-glove models. We present CLAPISA, a contrastive-learning framework that embeds a Siamese network into a CNN-LSTM spatiotemporal pipeline for cross-subject gesture recognition. Training employs a 1:2 positive-to-negative pairing strategy and an empirically optimized margin of 1.0, enabling the network to form subject-invariant, rehabilitation-relevant embeddings. Evaluated on a bending-sensor dataset of twenty young adults, CLAPISA attains an average accuracy of 96.71% under leave-one-subject-out cross-validation, outperforming five baseline models and reducing errors for the most challenging subjects by up to 12.3%. Although current validation is limited to a young cohort, the framework's data efficiency and subject-invariant design indicate strong potential for extension to elderly and neurologically impaired populations; collecting such data for further verification is our next step.
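The abstract states only the margin value (1.0) and the 1:2 positive-to-negative pairing ratio, not the exact loss formula. Assuming CLAPISA uses the standard margin-based contrastive loss (Hadsell et al.) on Euclidean distances between Siamese embeddings, a minimal NumPy sketch would look like this; the function name and array-based interface are illustrative, not taken from the paper:

```python
import numpy as np

def contrastive_loss(dist, is_positive, margin=1.0):
    """Standard margin-based contrastive loss (assumed form, not from the paper).

    dist        : Euclidean distances between paired Siamese embeddings
    is_positive : 1 for positive pairs (same gesture class), 0 for negative
    margin      : the abstract reports an empirically optimized margin of 1.0
    """
    dist = np.asarray(dist, dtype=float)
    is_positive = np.asarray(is_positive, dtype=float)
    # Positive pairs are pulled together: penalty grows with distance.
    pos_term = is_positive * dist**2
    # Negative pairs are pushed apart until they exceed the margin.
    neg_term = (1.0 - is_positive) * np.maximum(margin - dist, 0.0)**2
    return 0.5 * np.mean(pos_term + neg_term)
```

Under the 1:2 pairing strategy described in the abstract, each batch would contain twice as many negative pairs as positive ones, so the mean above implicitly weights the push-apart term more heavily.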
ISSN: 1534-4320
EISSN: 1558-0210
DOI: 10.1109/TNSRE.2025.3583303