Recognition of Upper Limb Action Intention Based on IMU
Published in | Sensors (Basel, Switzerland) Vol. 22; no. 5; p. 1954 |
Main Authors | Cui, Jian-Wei; Li, Zhi-Gang; Du, Han; Yan, Bing-Yan; Lu, Pu-Dong |
Format | Journal Article |
Language | English |
Published | Switzerland: MDPI AG, 02.03.2022 |
Subjects | 10-fold cross-validation; Accuracy; action intention recognition of upper limb; Algorithms; Amputation; Design; dividing motion unit; Electromyography - methods; Hand; inertial sensor; Intention; Machine learning; Neural networks; Neural Networks, Computer; Prostheses; prosthetic hand control; Sensors; Upper Extremity |
ISSN | 1424-8220 |
DOI | 10.3390/s22051954 |
Abstract | Using motion information of the upper limb to control a prosthetic hand has become an active research topic. The operation of the prosthetic hand must be coordinated with the user's intention, so identifying the action intention of the upper limb from its motion information is key to controlling the prosthetic hand. Because a wearable inertial sensor offers small size, low cost, and little interference from the external environment, we employ inertial sensors to collect angle and angular velocity data during movement of the upper limb. To classify the actions of putting on socks, putting on shoes, and tying shoelaces, this paper proposes a recognition model based on the Dynamic Time Warping (DTW) algorithm applied to motion units. Based on whether the upper limb is moving, the complete motion data are divided into several motion units. Considering the delay associated with controlling the prosthetic hand, this paper extracts features only from the first and second motion units and compares recognition across different classifiers. The experimental results reveal that the DTW algorithm based on motion units achieves a higher recognition rate and a lower running time: the recognition rate reaches 99.46%, and the average running time is 8.027 ms. To enable the prosthetic hand to understand the grasping intention of the upper limb, this paper also proposes a Generalized Regression Neural Network (GRNN) model based on 10-fold cross-validation. The motion state of the upper limb is subdivided, and the static state is used as the signal for controlling the prosthetic hand. A 10-fold cross-validation method is applied to train the neural network model and find the optimal smoothing parameter, and the recognition performance of different neural networks is compared. The experimental results show that the GRNN model based on 10-fold cross-validation reaches an accuracy of 98.28%. Finally, the two algorithms proposed in this paper are implemented in an experiment in which the prosthetic hand reproduces an action, and their feasibility and practicality are verified experimentally. |
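The segmentation-plus-DTW pipeline described in the abstract lends itself to a short illustration. The sketch below is a hypothetical reconstruction, not the paper's code: the angular-velocity threshold, the minimum unit length, and the template dictionary are assumptions made for illustration, and the DTW shown is the classic dynamic-programming form on 1-D sequences.

```python
# Hypothetical sketch: splitting an IMU angular-velocity stream into
# motion units and classifying a unit against DTW templates.
# Threshold and minimum length are illustrative, not the paper's values.
import numpy as np

def split_motion_units(omega, thresh=0.2, min_len=10):
    """Split a 1-D angular-velocity stream into motion units.

    A sample counts as 'moving' when |omega| exceeds `thresh`;
    runs of at least `min_len` consecutive moving samples form one unit.
    Returns a list of (start, end) index pairs.
    """
    moving = np.abs(omega) > thresh
    units, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i
        elif not m and start is not None:
            if i - start >= min_len:
                units.append((start, i))
            start = None
    if start is not None and len(omega) - start >= min_len:
        units.append((start, len(omega)))
    return units

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW on 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify(unit, templates):
    """Nearest-template label; `templates` maps label -> reference sequence."""
    return min(templates, key=lambda lbl: dtw_distance(unit, templates[lbl]))
```

Classifying only the first one or two motion units, as the abstract describes, would amount to calling classify() on units[0] and units[1] as soon as they become available, which is what keeps the control delay low.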
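The GRNN stage can be sketched in the same spirit. A GRNN is in effect a Gaussian-kernel-weighted average of training targets, so the minimal version below performs kernel regression over one-hot class labels and grid-searches the smoothing parameter sigma with 10-fold cross-validation, mirroring the abstract's description; the sigma grid, feature shapes, and fold seeding are illustrative assumptions rather than the paper's settings.

```python
# Hypothetical sketch: GRNN as Nadaraya-Watson kernel regression over
# one-hot class targets, with the smoothing parameter sigma chosen by
# 10-fold cross-validation.
import numpy as np

def grnn_predict(X_train, Y_train, X_test, sigma):
    """Y_train is one-hot (n_train, n_classes); returns class indices."""
    # Squared Euclidean distances between each test and training sample.
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))             # (n_test, n_train)
    scores = W @ Y_train / W.sum(1, keepdims=True)   # kernel-weighted average
    return scores.argmax(1)

def pick_sigma(X, y, sigmas, k=10, seed=0):
    """Return the sigma with the best mean accuracy over k folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    Y = np.eye(y.max() + 1)[y]                       # one-hot targets
    best, best_acc = None, -1.0
    for s in sigmas:
        accs = []
        for f in folds:
            tr = np.setdiff1d(idx, f)                # train on the other folds
            pred = grnn_predict(X[tr], Y[tr], X[f], s)
            accs.append((pred == y[f]).mean())
        if np.mean(accs) > best_acc:
            best, best_acc = s, float(np.mean(accs))
    return best, best_acc
```

A call such as pick_sigma(X, y, sigmas=np.logspace(-2, 1, 20)) would fix the smoothing parameter used at run time, with the static state of the limb then acting as the trigger for issuing the predicted grasp to the prosthetic hand.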
AuthorAffiliation | School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China; 220193290@seu.edu.cn (Z.-G.L.); 220203465@seu.edu.cn (H.D.); 220203464@seu.edu.cn (B.-Y.Y.); 220193325@seu.edu.cn (P.-D.L.) |
Copyright | 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
Discipline | Engineering |
EISSN | 1424-8220 |
GeographicLocations | China; Beijing, China |
GrantInformation | National Natural Science Foundation of China (grant 61873063) |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 5 |
Keywords | prosthetic hand control; 10-fold cross-validation; dividing motion unit; inertial sensor; action intention recognition of upper limb |
License | Creative Commons Attribution (CC BY 4.0): https://creativecommons.org/licenses/by/4.0 |
ORCID | 0000-0002-7796-1032 (Li, Zhi-Gang) |
OpenAccessLink | https://doaj.org/article/245eef58c189478cb57544ca2c7bac7d |
PMID | 35271101 |
PublicationDate | 2022-03-02 |
PublicationPlace | Basel, Switzerland |
PublicationTitle | Sensors (Basel, Switzerland) |
PublicationTitleAlternate | Sensors (Basel) |
PublicationYear | 2022 |
Publisher | MDPI AG |
StartPage | 1954 |
URI | https://www.ncbi.nlm.nih.gov/pubmed/35271101 https://www.proquest.com/docview/2637786431 https://www.proquest.com/docview/2638713308 https://pubmed.ncbi.nlm.nih.gov/PMC8914936 https://doaj.org/article/245eef58c189478cb57544ca2c7bac7d |
Volume | 22 |