Comparative Learning for Cross-Subject Finger Movement Recognition in Three Arm Postures via Data Glove

Bibliographic Details
Published in IEEE transactions on neural systems and rehabilitation engineering Vol. 33; p. 1
Main Authors Jiang, Lei, Zeng, Fengmeng, Yu, Annie
Format Journal Article
Language English
Published United States IEEE 01.01.2025
Subjects
Online Access Get full text

Abstract Reliable recognition of therapeutic hand and finger movements is a prerequisite for effective home-based rehabilitation, where patients must exercise without continuous therapist supervision. Inter-subject variability, stemming from differences in hand size, joint flexibility, and movement speed, limits the generalization of data-glove models. We present CLAPISA, a contrastive-learning framework that embeds a Siamese network into a CNN-LSTM spatiotemporal pipeline for cross-subject gesture recognition. Training employs a 1:2 positive-to-negative pairing strategy and an empirically optimized margin of 1.0, enabling the network to form subject-invariant, rehabilitation-relevant embeddings. Evaluated on a bending-sensor dataset collected from twenty young adults, CLAPISA attains an average accuracy of 96.71% under leave-one-subject-out cross-validation, outperforming five baseline models and reducing errors for the most challenging subjects by up to 12.3%. Although current validation is limited to a young cohort, the framework's data efficiency and subject-invariant design indicate strong potential for extension to elderly and neurologically impaired populations; our future work will collect such data for further verification.
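The training recipe summarized in the abstract (a Siamese margin-based contrastive loss with margin 1.0 and a 1:2 positive-to-negative pairing of glove-sensor windows) can be illustrated with the following minimal PyTorch sketch. This is a hypothetical illustration, not the authors' released code: the loss is the standard Hadsell-style margin contrastive loss, and the pair-construction helper, its names, and its data shapes are assumptions introduced here for clarity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveLoss(nn.Module):
    # Margin-based contrastive loss; margin = 1.0 as reported in the abstract.
    def __init__(self, margin: float = 1.0):
        super().__init__()
        self.margin = margin

    def forward(self, emb_a: torch.Tensor, emb_b: torch.Tensor, same: torch.Tensor) -> torch.Tensor:
        # same = 1.0 for positive pairs (same gesture), 0.0 for negative pairs.
        dist = F.pairwise_distance(emb_a, emb_b)
        pos_term = same * dist.pow(2)                                # pull positives together
        neg_term = (1.0 - same) * F.relu(self.margin - dist).pow(2)  # push negatives beyond the margin
        return 0.5 * (pos_term + neg_term).mean()

def make_pairs(labels: torch.Tensor, neg_per_pos: int = 2, seed: int = 0):
    # Build Siamese index pairs at the 1:2 positive-to-negative ratio described in the abstract.
    # labels: (N,) gesture ids for N glove-sensor windows; returns index pairs and pair targets.
    g = torch.Generator().manual_seed(seed)
    pairs, targets = [], []
    for i in range(labels.numel()):
        same_idx = torch.nonzero(labels == labels[i]).flatten()
        diff_idx = torch.nonzero(labels != labels[i]).flatten()
        others_same = same_idx[same_idx != i]
        if others_same.numel() == 0 or diff_idx.numel() < neg_per_pos:
            continue
        j = others_same[torch.randint(others_same.numel(), (1,), generator=g)].item()
        pairs.append((i, j)); targets.append(1.0)
        for k in diff_idx[torch.randperm(diff_idx.numel(), generator=g)[:neg_per_pos]]:
            pairs.append((i, k.item())); targets.append(0.0)
    return pairs, torch.tensor(targets)

In such a setup each pair of sensor windows would pass through a shared CNN-LSTM encoder and the loss would be applied to the two embeddings; how classification is read out from the embedding space at test time (e.g., nearest class centroid) is not specified in the abstract and would depend on the paper's full method.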
Author Yu, Annie
Zeng, Fengmeng
Jiang, Lei
Author_xml – sequence: 1
  givenname: Lei
  orcidid: 0009-0007-7825-6814
  surname: Jiang
  fullname: Jiang, Lei
  email: jianglei2@nbu.edu.cn
  organization: College of Science and Technology, Laboratory of Intelligent Home Appliances, Ningbo University, Ningbo, China
– sequence: 2
  givenname: Fengmeng
  surname: Zeng
  fullname: Zeng, Fengmeng
  email: zfmeng@zstu.edu.cn
  organization: Key Laboratory of Intelligent Textile and Flexible Interconnection of Zhejiang Province, Hangzhou, China
– sequence: 3
  givenname: Annie
  surname: Yu
  fullname: Yu, Annie
  email: annie.tw.yu@polyu.edu.hk
  organization: School of Fashion and Textiles, Hong Kong Polytechnic University, Hong Kong, China
BackLink https://www.ncbi.nlm.nih.gov/pubmed/40569808$$D View this record in MEDLINE/PubMed
CODEN ITNSB3
Cites_doi 10.1038/s41598-023-36490-w
10.5555/3524938.3525087
10.1038/s41597-022-01484-2
10.1109/CVPR.2015.7298683
10.1016/S1474-4422(09)70150-4
10.1109/TNSRE.2019.2896269
10.1109/ICORR.2019.8779533
10.1145/2968219.2971403
10.1016/j.clinbiomech.2005.01.002
10.1038/s41598-019-44896-8
10.1016/j.compbiomed.2018.08.020
10.1109/TNSRE.2022.3156387
10.1155/2017/5090454
10.1109/JSEN.2022.3167696
10.1016/j.engappai.2023.107251
10.1016/j.neunet.2018.07.011
10.1109/ICNLP55136.2022.00079
10.20965/jaciii.2022.p0113
10.1109/ROBIO.2018.8664807
10.1109/SoSE50414.2020.9130489
10.1016/j.mejo.2018.01.014
10.1109/TSMCC.2008.923862
10.1109/ICIP49359.2023.10222870
10.1109/CcS49175.2020.9231515
10.1109/CVPRW.2016.153
10.1016/j.cmpb.2016.02.012
10.1016/j.bbe.2022.02.005
10.1007/s40846-019-00491-w
10.35940/ijrte.D6801.118419
10.1109/JSEN.2020.3014276
10.1109/WiSPNET48689.2020.9198521
10.1109/LRA.2021.3089999
10.1109/IJCB54206.2022.10008005
10.1109/ICCC57788.2023.10233557
10.1109/TRO.2012.2226386
10.32604/cmc.2022.019586
10.3390/s22041321
10.1109/IJCNN60899.2024.10650863
10.1177/1545968319868716
10.1109/TBME.2022.3140269
10.1007/s11633-022-1386-4
10.3233/NRE-172412
10.1109/ICCVW60793.2023.00024
10.1016/S0021-9290(02)00229-4
10.1016/j.eswa.2023.121055
10.1109/ACCESS.2023.3235368
10.1109/TNSRE.2016.2626800
10.1109/JSEN.2020.3011825
10.1109/LRA.2022.3169448
10.1016/0021-9290(94)90023-X
ContentType Journal Article
DBID 97E
ESBDL
RIA
RIE
AAYXX
CITATION
CGR
CUY
CVF
ECM
EIF
NPM
7X8
DOA
DOI 10.1109/TNSRE.2025.3583303
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE Xplore Open Access Journals
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE/IET Electronic Library
CrossRef
Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
MEDLINE - Academic
DOAJ Directory of Open Access Journals
DatabaseTitle CrossRef
MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
MEDLINE - Academic
DatabaseTitleList
MEDLINE
MEDLINE - Academic

Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: NPM
  name: PubMed
  url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 3
  dbid: EIF
  name: MEDLINE
  url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search
  sourceTypes: Index Database
– sequence: 4
  dbid: RIE
  name: IEEE/IET Electronic Library
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Occupational Therapy & Rehabilitation
EISSN 1558-0210
EndPage 1
ExternalDocumentID oai_doaj_org_article_3318a1c5242c47afa5cf4fdbac91c395
40569808
10_1109_TNSRE_2025_3583303
11052880
Genre orig-research
Research Support, Non-U.S. Gov't
Journal Article
Comparative Study
GrantInformation_xml – fundername: The Key Laboratory of Intelligent Textile and Flexible Interconnection
  grantid: YB16
– fundername: China Postdoctoral Science Foundation
  grantid: 2024M750518
  funderid: 10.13039/501100002858
GroupedDBID ---
-~X
0R~
29I
4.4
5GY
6IK
97E
AAFWJ
AAJGR
AASAJ
AAWTH
ABAZT
ABVLG
ACGFO
ACGFS
ACIWK
ACPRK
AENEX
AFPKN
AFRAH
ALMA_UNASSIGNED_HOLDINGS
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
EBS
ESBDL
F5P
GROUPED_DOAJ
HZ~
IFIPE
IPLJI
JAVBF
LAI
O9-
OCL
OK1
P2P
RIA
RIE
RNS
53G
5VS
AAYXX
AETIX
AGSQL
AIBXA
CITATION
EJD
H~9
M43
RIG
CGR
CUY
CVF
ECM
EIF
NPM
7X8
ID FETCH-LOGICAL-c387t-bb58f71f55fa4ebf4b6c151ca65c2fe655c287e42c3bbdb5c2ae1e6989c0a0903
IEDL.DBID DOA
ISSN 1534-4320
1558-0210
IngestDate Wed Aug 27 01:27:36 EDT 2025
Fri Jul 11 16:58:51 EDT 2025
Sat Aug 02 01:40:57 EDT 2025
Thu Jul 10 07:49:10 EDT 2025
Wed Aug 27 02:14:34 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by-nc-nd/4.0
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c387t-bb58f71f55fa4ebf4b6c151ca65c2fe655c287e42c3bbdb5c2ae1e6989c0a0903
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
ORCID 0009-0007-7825-6814
0000-0002-3413-1880
OpenAccessLink https://doaj.org/article/3318a1c5242c47afa5cf4fdbac91c395
PMID 40569808
PQID 3224639240
PQPubID 23479
PageCount 1
ParticipantIDs ieee_primary_11052880
crossref_primary_10_1109_TNSRE_2025_3583303
pubmed_primary_40569808
doaj_primary_oai_doaj_org_article_3318a1c5242c47afa5cf4fdbac91c395
proquest_miscellaneous_3224639240
PublicationCentury 2000
PublicationDate 2025-01-01
PublicationDateYYYYMMDD 2025-01-01
PublicationDate_xml – month: 01
  year: 2025
  text: 2025-01-01
  day: 01
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
PublicationTitle IEEE transactions on neural systems and rehabilitation engineering
PublicationTitleAbbrev TNSRE
PublicationTitleAlternate IEEE Trans Neural Syst Rehabil Eng
PublicationYear 2025
Publisher IEEE
Publisher_xml – name: IEEE
References ref13
ref12
ref15
ref14
ref11
ref10
ref17
ref16
ref19
ref18
ref51
ref50
ref46
ref45
ref48
ref47
ref42
ref41
ref44
ref43
ref49
ref8
ref7
ref9
ref4
ref3
ref5
ref40
ref35
ref34
ref37
ref36
ref31
ref30
ref33
ref32
ref2
ref1
ref39
ref38
ref24
ref23
ref26
Ariesta (ref6) 2018; 26
ref25
ref20
ref22
ref21
ref28
ref27
ref29
References_xml – ident: ref32
  doi: 10.1038/s41598-023-36490-w
– ident: ref37
  doi: 10.5555/3524938.3525087
– ident: ref45
  doi: 10.1038/s41597-022-01484-2
– ident: ref50
  doi: 10.1109/CVPR.2015.7298683
– ident: ref2
  doi: 10.1016/S1474-4422(09)70150-4
– ident: ref34
  doi: 10.1109/TNSRE.2019.2896269
– ident: ref22
  doi: 10.1109/ICORR.2019.8779533
– ident: ref15
  doi: 10.1145/2968219.2971403
– ident: ref19
  doi: 10.1016/j.clinbiomech.2005.01.002
– ident: ref16
  doi: 10.1038/s41598-019-44896-8
– ident: ref10
  doi: 10.1016/j.compbiomed.2018.08.020
– ident: ref9
  doi: 10.1109/TNSRE.2022.3156387
– ident: ref8
  doi: 10.1155/2017/5090454
– ident: ref13
  doi: 10.1109/JSEN.2022.3167696
– ident: ref46
  doi: 10.1016/j.engappai.2023.107251
– ident: ref47
  doi: 10.1016/j.neunet.2018.07.011
– ident: ref40
  doi: 10.1109/ICNLP55136.2022.00079
– ident: ref25
  doi: 10.20965/jaciii.2022.p0113
– ident: ref24
  doi: 10.1109/ROBIO.2018.8664807
– ident: ref17
  doi: 10.1109/SoSE50414.2020.9130489
– ident: ref1
  doi: 10.1016/j.mejo.2018.01.014
– ident: ref12
  doi: 10.1109/TSMCC.2008.923862
– ident: ref42
  doi: 10.1109/ICIP49359.2023.10222870
– ident: ref26
  doi: 10.1109/CcS49175.2020.9231515
– ident: ref51
  doi: 10.1109/CVPRW.2016.153
– ident: ref5
  doi: 10.1016/j.cmpb.2016.02.012
– ident: ref28
  doi: 10.1016/j.bbe.2022.02.005
– ident: ref21
  doi: 10.1007/s40846-019-00491-w
– ident: ref23
  doi: 10.35940/ijrte.D6801.118419
– ident: ref27
  doi: 10.1109/JSEN.2020.3014276
– ident: ref38
  doi: 10.1109/WiSPNET48689.2020.9198521
– ident: ref31
  doi: 10.1109/LRA.2021.3089999
– ident: ref43
  doi: 10.1109/IJCB54206.2022.10008005
– ident: ref44
  doi: 10.1109/ICCC57788.2023.10233557
– ident: ref33
  doi: 10.1109/TRO.2012.2226386
– volume: 26
  start-page: 1659
  issue: 4
  year: 2018
  ident: ref6
  article-title: A survey of hand gesture recognition methods in sign language recognition
  publication-title: Pertanika J. Sci. Technol.
– ident: ref30
  doi: 10.32604/cmc.2022.019586
– ident: ref35
  doi: 10.3390/s22041321
– ident: ref48
  doi: 10.1109/IJCNN60899.2024.10650863
– ident: ref4
  doi: 10.1177/1545968319868716
– ident: ref11
  doi: 10.1109/TBME.2022.3140269
– ident: ref41
  doi: 10.1007/s11633-022-1386-4
– ident: ref3
  doi: 10.3233/NRE-172412
– ident: ref39
  doi: 10.1109/ICCVW60793.2023.00024
– ident: ref20
  doi: 10.1016/S0021-9290(02)00229-4
– ident: ref36
  doi: 10.1016/j.eswa.2023.121055
– ident: ref49
  doi: 10.1109/ACCESS.2023.3235368
– ident: ref7
  doi: 10.1109/TNSRE.2016.2626800
– ident: ref14
  doi: 10.1109/JSEN.2020.3011825
– ident: ref29
  doi: 10.1109/LRA.2022.3169448
– ident: ref18
  doi: 10.1016/0021-9290(94)90023-X
SSID ssj0017657
Score 2.4386022
Snippet Reliable recognition of therapeutic hand and finger movements is a prerequisite for effective home-based rehabilitation, where patients must exercise without...
SourceID doaj
proquest
pubmed
crossref
ieee
SourceType Open Website
Aggregation Database
Index Database
Publisher
StartPage 1
SubjectTerms Accuracy
Adaptation models
Adult
Algorithms
Arm - physiology
Biomedical monitoring
comparative learning
Contrastive learning
cross-subject
data glove
Data gloves
Data models
Feature extraction
Female
finger movement recognition
Fingers - physiology
Gesture recognition
Gestures
Hands
Humans
Indexes
Machine Learning
Male
Movement - physiology
Neural Networks, Computer
Pattern Recognition, Automated - methods
Posture - physiology
Reproducibility of Results
Siamese network
Young Adult
SummonAdditionalLinks – databaseName: IEEE/IET Electronic Library
  dbid: RIE
  priority: 102
  providerName: IEEE
Title Comparative Learning for Cross-Subject Finger Movement Recognition in Three Arm Postures via Data Glove
URI https://ieeexplore.ieee.org/document/11052880
https://www.ncbi.nlm.nih.gov/pubmed/40569808
https://www.proquest.com/docview/3224639240
https://doaj.org/article/3318a1c5242c47afa5cf4fdbac91c395
Volume 33
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider Directory of Open Access Journals
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Comparative+Learning+for+Cross-Subject+Finger+Movement+Recognition+in+Three+Arm+Postures+via+Data+Glove&rft.jtitle=IEEE+transactions+on+neural+systems+and+rehabilitation+engineering&rft.au=Jiang%2C+Lei&rft.au=Zeng%2C+Fengmeng&rft.au=Yu%2C+Annie&rft.date=2025-01-01&rft.eissn=1558-0210&rft.volume=33&rft.spage=2531&rft_id=info:doi/10.1109%2FTNSRE.2025.3583303&rft_id=info%3Apmid%2F40569808&rft.externalDocID=40569808