A Deep Learning Framework for Assessing Physical Rehabilitation Exercises

Bibliographic Details
Published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 28, No. 2, pp. 468-477
Main Authors Liao, Yalin, Vakanski, Aleksandar, Xian, Min
Format Journal Article
Language English
Published United States: IEEE, 01.02.2020
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
ISSN 1534-4320
EISSN 1558-0210
DOI 10.1109/TNSRE.2020.2966249

Abstract Computer-aided assessment of physical rehabilitation entails evaluation of patient performance in completing prescribed rehabilitation exercises, based on processing movement data captured with a sensory system. Despite the essential role of rehabilitation assessment toward improved patient outcomes and reduced healthcare costs, existing approaches lack versatility, robustness, and practical relevance. In this paper, we propose a deep learning-based framework for automated assessment of the quality of physical rehabilitation exercises. The main components of the framework are metrics for quantifying movement performance, scoring functions for mapping the performance metrics into numerical scores of movement quality, and deep neural network models for generating quality scores of input movements via supervised learning. The proposed performance metric is defined based on the log-likelihood of a Gaussian mixture model, and encodes a low-dimensional data representation obtained with a deep autoencoder network. The proposed deep spatio-temporal neural network arranges data into temporal pyramids, and exploits the spatial characteristics of human movements by using sub-networks to process joint displacements of individual body parts. The presented framework is validated using a dataset of ten rehabilitation exercises. The significance of this work is that it is the first to implement deep neural networks for assessment of rehabilitation performance.
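The pipeline the abstract describes (a low-dimensional encoding of the movement data, a Gaussian mixture model fitted to correct movements whose log-likelihood serves as the performance metric, and a scoring function mapping that metric to a movement-quality score) can be sketched in miniature. The following is an illustrative reconstruction, not the authors' implementation: the data are synthetic stand-ins for autoencoder encodings, the GMM is fitted with plain diagonal-covariance EM, and the sigmoid scoring function and its `alpha` parameter are assumptions rather than the paper's scoring function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "correct movement" encodings in a 2-D latent space (the paper
# obtains such encodings from a deep autoencoder; here they are simulated
# as a two-cluster mixture).
Z = np.vstack([
    rng.normal([0.0, 0.0], 0.5, size=(150, 2)),
    rng.normal([3.0, 3.0], 0.5, size=(150, 2)),
])

def fit_gmm(Z, k=2, iters=50):
    """Fit a diagonal-covariance GMM with plain EM (illustrative only)."""
    n, d = Z.shape
    w = np.full(k, 1.0 / k)                    # mixture weights
    mu = Z[rng.choice(n, k, replace=False)]    # random initial means
    var = np.full((k, d), Z.var(axis=0))       # per-dimension variances
    for _ in range(iters):
        # E-step: responsibilities from per-component log-densities
        logp = (-0.5 * ((Z[:, None, :] - mu) ** 2 / var).sum(-1)
                - 0.5 * np.log(2 * np.pi * var).sum(-1)
                + np.log(w))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r.T @ Z) / nk[:, None]
        var = (r.T @ (Z ** 2)) / nk[:, None] - mu ** 2 + 1e-6
    return w, mu, var

def log_likelihood(x, w, mu, var):
    """Performance metric: GMM log-likelihood of one encoded movement."""
    logp = (-0.5 * ((x - mu) ** 2 / var).sum(-1)
            - 0.5 * np.log(2 * np.pi * var).sum(-1)
            + np.log(w))
    m = logp.max()
    return m + np.log(np.exp(logp - m).sum())   # log-sum-exp over components

def quality_score(ll, ref_ll, alpha=1.0):
    """Scoring function: squash the log-likelihood into (0, 1).
    The sigmoid form and alpha are assumptions, not taken from the paper."""
    return 1.0 / (1.0 + np.exp(-alpha * (ll - ref_ll)))

w, mu, var = fit_gmm(Z)
ref = np.mean([log_likelihood(z, w, mu, var) for z in Z])

good = log_likelihood(np.array([0.1, -0.1]), w, mu, var)  # near a cluster
bad = log_likelihood(np.array([8.0, -8.0]), w, mu, var)   # far from both

print(quality_score(good, ref) > quality_score(bad, ref))
```

Because the scoring function is monotone in the log-likelihood, a movement whose encoding lies far from the density of correct movements necessarily receives a lower quality score, which is the essential behavior of the metric the abstract describes.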
Author Liao, Yalin
Vakanski, Aleksandar
Xian, Min
Author_xml – sequence: 1
  givenname: Yalin
  surname: Liao
  fullname: Liao, Yalin
  email: liao4728@vandals.uidaho.edu
  organization: Department of Computer Science, University of Idaho, Idaho Falls, ID, USA
– sequence: 2
  givenname: Aleksandar
  orcidid: 0000-0003-3365-1291
  surname: Vakanski
  fullname: Vakanski, Aleksandar
  email: vakanski@uidaho.edu
  organization: Department of Computer Science, University of Idaho, Idaho Falls, ID, USA
– sequence: 3
  givenname: Min
  orcidid: 0000-0001-6098-4441
  surname: Xian
  fullname: Xian, Min
  email: mxian@uidaho.edu
  organization: Department of Computer Science, University of Idaho, Idaho Falls, ID, USA
CODEN ITNSB3
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020
Discipline Occupational Therapy & Rehabilitation
EISSN 1558-0210
EndPage 477
ExternalDocumentID PMC7032994
31940544
10_1109_TNSRE_2020_2966249
8957502
Genre orig-research
Journal Article
Research Support, N.I.H., Extramural
GrantInformation_xml – fundername: University of Idaho
  grantid: P20GM104420
  funderid: 10.13039/100012326
– fundername: NIGMS NIH HHS
  grantid: P20 GM104420
IsPeerReviewed true
IsScholarly true
Issue 2
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0001-6098-4441
0000-0003-3365-1291
PMID 31940544
PQID 2357224338
PQPubID 85423
PageCount 10
PublicationCentury 2000
PublicationDate 2020-02-01
PublicationDateYYYYMMDD 2020-02-01
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
– name: New York
PublicationTitle IEEE transactions on neural systems and rehabilitation engineering
PublicationTitleAbbrev TNSRE
PublicationTitleAlternate IEEE Trans Neural Syst Rehabil Eng
PublicationYear 2020
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 468
SubjectTerms Algorithms
Artificial neural networks
Automation
Biomechanical Phenomena
Body parts
Business metrics
Computational modeling
Computer Simulation
Data models
Deep Learning
Dimensionality reduction
Exercise Therapy - methods
Healthy Volunteers
Hidden Markov models
Human motion
Humans
Machine learning
Mapping
Measurement
Movement - physiology
Movement modeling
Neural networks
Neural Networks, Computer
Normal Distribution
Performance evaluation
Performance measurement
performance metrics
physical rehabilitation
Probabilistic models
Pyramids
Quality assessment
Rehabilitation
Rehabilitation - methods
Robustness (mathematics)
Solid modeling
Treatment Outcome
Title A Deep Learning Framework for Assessing Physical Rehabilitation Exercises
URI https://ieeexplore.ieee.org/document/8957502
https://www.ncbi.nlm.nih.gov/pubmed/31940544
https://www.proquest.com/docview/2357224338
https://www.proquest.com/docview/2339791222
https://pubmed.ncbi.nlm.nih.gov/PMC7032994
Volume 28