Machine Learning for Human Motion Intention Detection

Bibliographic Details
Published in Sensors (Basel, Switzerland) Vol. 23; no. 16; p. 7203
Main Authors Lin, Jun-Ji; Hsu, Che-Kang; Hsu, Wei-Li; Tsao, Tsu-Chin; Wang, Fu-Cheng; Yen, Jia-Yush
Format Journal Article
Language English
Published Basel MDPI AG 01.08.2023
Abstract An exoskeleton gait pattern that conflicts with the human operator’s (the pilot’s) intention may cause awkward maneuvering or even injury, so deciding the proper gait operation has been the focus of many studies. The timing of the recognition plays a crucial role: delayed detection of the pilot’s intent can be just as undesirable for exoskeleton operation. Instead of recognizing the motion itself, this study examines the possibility of identifying the transition between gaits to achieve in-time detection. The study used data from IMU sensors, with future mobile applications in mind, and tested two machine learning networks: a linear feedforward neural network and a long short-term memory network. The gait data are from five subjects for training and testing. The results show that: 1. The networks can successfully separate the transition period from the motion periods. 2. Detection of a gait change from walking to sitting can be as fast as 0.17 s, which is adequate for future control applications; however, detecting the transition from standing to walking can take as long as 1.2 s. 3. A network trained on one person can also detect movement changes for other persons without deteriorating performance.
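The abstract describes feeding sliding IMU windows to two classifiers, a feedforward network and an LSTM, to flag the transition between gaits. As a rough illustration of that kind of setup (not the paper's implementation: the channel count, window length, hidden size, and class labels below are assumed placeholders), a minimal PyTorch sketch:

# Minimal sketch, assuming 6 IMU channels (3-axis accel + 3-axis gyro),
# 100-sample windows, and 3 labels (e.g., walk / sit / transition).
# These values are illustrative; the paper's actual configuration may differ.
import torch
import torch.nn as nn

class GaitTransitionLSTM(nn.Module):
    """LSTM classifier: reads the IMU window one time step at a time."""
    def __init__(self, n_channels=6, hidden=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):             # x: (batch, time, channels)
        _, (h_n, _) = self.lstm(x)    # final hidden state summarizes the window
        return self.head(h_n[-1])     # logits: (batch, n_classes)

class GaitTransitionFNN(nn.Module):
    """Feedforward baseline: flattens the whole window into one vector."""
    def __init__(self, n_channels=6, window=100, hidden=64, n_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),             # (batch, time * channels)
            nn.Linear(window * n_channels, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

# Slide the window over the live IMU stream; an early "transition" prediction
# is what makes in-time detection possible.
x = torch.randn(8, 100, 6)            # batch of 8 hypothetical windows
print(GaitTransitionLSTM()(x).shape)  # torch.Size([8, 3])
print(GaitTransitionFNN()(x).shape)   # torch.Size([8, 3])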
Audience Academic
Author Hsu, Che-Kang
Hsu, Wei-Li
Wang, Fu-Cheng
Tsao, Tsu-Chin
Yen, Jia-Yush
Lin, Jun-Ji
AuthorAffiliation 1 Department of Mechanical Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei City 106319, Taiwan
2 School and Graduate Institute of Physical Therapy, National Taiwan University, No. 17, Xuzhou Rd., Zhongzheng Dist., Taipei City 100025, Taiwan
3 Mechanical and Aerospace Engineering, Samueli School of Engineering, University of California, Los Angeles, CA 90095, USA
4 Department of Mechanical Engineering, National Taiwan University of Science and Technology, No. 43, Keelung Rd., Sec. 4, Da’an Dist., Taipei City 106335, Taiwan
CitedBy_id 10.1080/23311886.2025.2474863
10.1038/s41598-025-90307-6
ContentType Journal Article
Copyright COPYRIGHT 2023 MDPI AG
2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
2023 by the authors.
DOI 10.3390/s23167203
Discipline Engineering
EISSN 1424-8220
ExternalDocumentID PMC10459653
GrantInformation Ministry of Science and Technology, Taiwan ROC (grant 111-2811-E-011-014-MY3)
ISSN 1424-8220
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 16
Language English
License https://creativecommons.org/licenses/by/4.0
Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
ORCID 0000-0001-5011-7934
OpenAccessLink http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s23167203
PMID 37631740
PublicationDate 2023-08-01
PublicationPlace Basel
PublicationTitle Sensors (Basel, Switzerland)
PublicationYear 2023
Publisher MDPI AG
StartPage 7203
SubjectTerms Calibration
Data collection
feedforward neural network (FNN)
Gait
human intention detection
human–robot interaction
inertial measurement unit (IMU)
Kinematics
long short-term memory (LSTM)
Machine learning
Mobile applications
Neural networks
Robotics
Robots
Sensors
Technical Note
Wireless telephone software
Title Machine Learning for Human Motion Intention Detection
URI https://www.proquest.com/docview/2857447057
https://www.proquest.com/docview/2857843428
https://pubmed.ncbi.nlm.nih.gov/PMC10459653
https://doaj.org/article/c06afaf55445487491b97cbf108e6de7
Volume 23