Human Posture Transition-Time Detection Based upon Inertial Measurement Unit and Long Short-Term Memory Neural Networks
Published in | Biomimetics (Basel, Switzerland) Vol. 8; no. 6; p. 471 |
Main Authors | Kuo, Chun-Ting; Lin, Jun-Ji; Jen, Kuo-Kuang; Hsu, Wei-Li; Wang, Fu-Cheng; Tsao, Tsu-Chin; Yen, Jia-Yush |
Format | Journal Article |
Language | English |
Published | Basel: MDPI AG, 01.10.2023 |
Abstract | As human–robot interaction becomes more prevalent in industrial and clinical settings, detecting changes in human posture has become increasingly crucial. While recognizing human actions has been extensively studied, the transition between different postures or movements has been largely overlooked. This study explores using two deep-learning methods, the linear Feedforward Neural Network (FNN) and Long Short-Term Memory (LSTM), to detect changes in human posture among three different movements: standing, walking, and sitting. To explore the possibility of rapid posture-change detection upon human intention, the authors introduced transition stages as distinct features for the identification. During the experiment, the subject wore an inertial measurement unit (IMU) on their right leg to measure joint parameters. The measurement data were used to train the two machine learning networks, and their performances were tested. This study also examined the effect of the sampling rates on the LSTM network. The results indicate that both methods achieved high detection accuracies. Still, the LSTM model outperformed the FNN in terms of speed and accuracy, achieving 91% and 95% accuracy for data sampled at 25 Hz and 100 Hz, respectively. Additionally, the network trained for one test subject was able to detect posture changes in other subjects, demonstrating the feasibility of personalized or generalized deep learning models for detecting human intentions. The accuracies for posture transition time and identification at a sampling rate of 100 Hz were 0.17 s and 94.44%, respectively. In summary, this study achieved some good outcomes and laid a crucial foundation for the engineering application of digital twins, exoskeletons, and human intention control. |
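The abstract's key idea is to treat the transition stages between postures as classes of their own, so a classifier trained on fixed-length IMU windows can flag a posture change as soon as one window straddles it. A minimal sketch of that labeling scheme, assuming synthetic data and a hypothetical pair-encoding (this is illustrative, not the authors' code):

```python
import numpy as np

def make_windows(samples, labels, win, step):
    """Slice an IMU sample stream into fixed-length windows.

    samples: (T, C) array of joint measurements (e.g. angle, angular rate).
    labels:  (T,) integer posture code per sample.
    Each window is coded as 10*first + last, so a steady window maps to a
    repeated digit (sitting 0 -> 0, standing 1 -> 11) while a window that
    straddles a change gets a distinct transition code (sit->stand: 1).
    """
    X, y = [], []
    for start in range(0, len(samples) - win + 1, step):
        X.append(samples[start:start + win])
        first, last = labels[start], labels[start + win - 1]
        y.append(10 * int(first) + int(last))
    return np.stack(X), np.array(y)

# 1 s of synthetic data at 100 Hz: posture 0 (sitting) for 0.5 s, then 1 (standing)
t = np.arange(100)
samples = np.column_stack([np.sin(t / 10), np.cos(t / 10)])
labels = np.where(t < 50, 0, 1)
X, y = make_windows(samples, labels, win=25, step=10)
print(X.shape)      # (8, 25, 2)
print(y.tolist())   # [0, 0, 0, 1, 1, 11, 11, 11]
```

In the study these windows would feed the FNN or LSTM classifier; the distinct transition labels are what let the network report a posture change within roughly one window of the event, consistent with the 0.17 s transition-time figure reported at 100 Hz.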
Audience | Academic |
Author | Wang, Fu-Cheng; Kuo, Chun-Ting; Yen, Jia-Yush; Jen, Kuo-Kuang; Hsu, Wei-Li; Tsao, Tsu-Chin; Lin, Jun-Ji |
AuthorAffiliation | 1 Department of Mechanical Engineering, National Taiwan University, Taipei 106319, Taiwan; 2 Missile and Rocket Research Division, National Chung Shan Institute of Science and Technology, Taoyuan 325204, Taiwan; 3 School and Graduate Institute of Physical Therapy, National Taiwan University, Taipei 106319, Taiwan; 4 Mechanical and Aerospace Engineering, Samueli School of Engineering, UCLA, Los Angeles, CA 90095, USA; 5 Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 106319, Taiwan |
CitedBy_id | 10.3390/biomimetics8080591; 10.1007/s12206-024-0731-7; 10.3390/biomimetics9050263; 10.3390/act13080284; 10.3390/s24020686 |
ContentType | Journal Article |
Copyright | COPYRIGHT 2023 MDPI AG 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. 2023 by the authors. 2023 |
DOI | 10.3390/biomimetics8060471 |
Discipline | Anatomy & Physiology |
EISSN | 2313-7673 |
ExternalDocumentID | oai_doaj_org_article_42c3bdd158ca4816b2c71e5ba6b55053 PMC10604330 A771911731 10_3390_biomimetics8060471 |
GeographicLocations | Taiwan |
GrantInformation | National Science and Technology Council, R.O.C., grant 108-2221-E-011-166-MY3 |
ISSN | 2313-7673 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 6 |
License | https://creativecommons.org/licenses/by/4.0 Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
ORCID | 0000-0003-4577-140X 0000-0002-5151-113X 0000-0001-5011-7934 0000-0002-5783-8141 |
OpenAccessLink | https://doaj.org/article/42c3bdd158ca4816b2c71e5ba6b55053 |
PMID | 37887602 |
PublicationDate | 2023-10-01 |
PublicationPlace | Basel |
PublicationTitle | Biomimetics (Basel, Switzerland) |
PublicationYear | 2023 |
Publisher | MDPI AG MDPI |
StartPage | 471 |
SubjectTerms | Accuracy; Algorithms; Artificial intelligence; Deep learning; Discriminant analysis; Electromyography; Exoskeleton; Feedback; feedforward neural network (FNN); Gait; human posture change detection; inertial measurement unit (IMU); internal sensing; Long short-term memory (LSTM); Machine learning; Measurement; Neural networks; Performance evaluation; Posture; Robots; Sampling; Sensors; Statistical methods |
URI | https://www.proquest.com/docview/2882377378 https://www.proquest.com/docview/2883577755 https://pubmed.ncbi.nlm.nih.gov/PMC10604330 https://doaj.org/article/42c3bdd158ca4816b2c71e5ba6b55053 |
Volume | 8 |