Human Activity Recognition Based on Residual Network and BiLSTM
Published in | Sensors (Basel, Switzerland) Vol. 22; no. 2; p. 635 |
---|---|
Main Authors | Li, Yong; Wang, Luping |
Format | Journal Article |
Language | English |
Published | Switzerland: MDPI AG, 01.01.2022 |
Abstract | Due to the wide application of human activity recognition (HAR) in sports and health, a large number of deep-learning-based HAR models have been proposed. However, many existing models neglect the effective extraction of the spatial and temporal features of human activity data. This paper proposes a deep learning model based on residual blocks and a bi-directional LSTM (BiLSTM). The model first extracts spatial features from the multidimensional signals of MEMS inertial sensors automatically using residual blocks, and then captures the forward and backward dependencies of the feature sequence using the BiLSTM. Finally, the obtained features are fed into a Softmax layer to complete human activity recognition. The optimal parameters of the model are determined experimentally. A homemade dataset containing six common human activities (sitting, standing, walking, running, going upstairs, and going downstairs) is developed. The proposed model is evaluated on this dataset and on two public datasets, WISDM and PAMAP2. The experimental results show that the proposed model achieves accuracies of 96.95%, 97.32%, and 97.15% on our dataset, WISDM, and PAMAP2, respectively. Compared with some existing models, the proposed model offers better performance with fewer parameters. |
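The pipeline described in the abstract (residual blocks for spatial feature extraction, a BiLSTM for forward/backward temporal dependencies, and a Softmax classifier) can be sketched as below. This is a minimal illustration, not the authors' implementation: the record does not give the paper's hyperparameters, so the channel count, hidden size, kernel width, number of residual blocks, and the use of the last time step for classification are all assumptions.

```python
# Illustrative sketch only; layer sizes and structure are assumptions,
# not the configuration reported in the paper.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """1-D convolutional residual block for multichannel inertial signals."""

    def __init__(self, channels: int, kernel_size: int = 5):
        super().__init__()
        padding = kernel_size // 2  # keep the temporal length unchanged
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
        self.bn2 = nn.BatchNorm1d(channels)
        self.relu = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + x)  # skip connection


class ResBiLSTM(nn.Module):
    """Residual blocks extract spatial features, a BiLSTM models the temporal
    dependencies in both directions, and a linear + softmax head classifies."""

    def __init__(self, in_channels: int = 6, hidden: int = 64,
                 num_classes: int = 6, num_blocks: int = 2):
        super().__init__()
        self.stem = nn.Conv1d(in_channels, hidden, kernel_size=1)
        self.blocks = nn.Sequential(*[ResidualBlock(hidden) for _ in range(num_blocks)])
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) windows of accelerometer/gyroscope samples
        feats = self.blocks(self.stem(x.transpose(1, 2)))  # (batch, hidden, time)
        seq, _ = self.bilstm(feats.transpose(1, 2))        # (batch, time, 2*hidden)
        logits = self.head(seq[:, -1, :])                  # last time step summarizes the window
        return torch.softmax(logits, dim=-1)


if __name__ == "__main__":
    model = ResBiLSTM()
    window = torch.randn(8, 128, 6)  # 8 windows, 128 samples, 6 IMU channels (assumed)
    print(model(window).shape)       # torch.Size([8, 6]) class probabilities
```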
Audience | Academic |
Author | Wang, Luping; Li, Yong |
AuthorAffiliation | 1 School of Biomedical Engineering, Sun Yat-sen University, Guangzhou 510006, China; liyong67@mail2.sysu.edu.cn 2 School of Electronics and Communication Engineering, Sun Yat-sen University, Guangzhou 510006, China |
AuthorAffiliation_xml | – name: 2 School of Electronics and Communication Engineering, Sun Yat-sen University, Guangzhou 510006, China – name: 1 School of Biomedical Engineering, Sun Yat-sen University, Guangzhou 510006, China; liyong67@mail2.sysu.edu.cn |
Author_xml | – sequence: 1 givenname: Yong surname: Li fullname: Li, Yong – sequence: 2 givenname: Luping surname: Wang fullname: Wang, Luping |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/35062604 (View this record in MEDLINE/PubMed) |
ContentType | Journal Article |
Copyright | COPYRIGHT 2022 MDPI AG 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. 2022 by the authors. 2022 |
Copyright_xml | – notice: COPYRIGHT 2022 MDPI AG – notice: 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. – notice: 2022 by the authors. 2022 |
DOI | 10.3390/s22020635 |
DatabaseName | CrossRef Medline MEDLINE MEDLINE (Ovid) MEDLINE MEDLINE PubMed ProQuest Central (Corporate) Health & Medical Collection ProQuest Central (purchase pre-March 2016) Medical Database (Alumni Edition) Hospital Premium Collection Hospital Premium Collection (Alumni Edition) ProQuest Central (Alumni) (purchase pre-March 2016) ProQuest Central (Alumni) ProQuest Central UK/Ireland ProQuest Central Essentials ProQuest Central (New) ProQuest One Community College ProQuest Central Korea Health Research Premium Collection Health Research Premium Collection (Alumni) ProQuest Health & Medical Complete (Alumni) ProQuest Health & Medical Collection PML(ProQuest Medical Library) ProQuest Central Premium ProQuest One Academic Publicly Available Content Database ProQuest Health & Medical Research Collection ProQuest One Academic Middle East (New) ProQuest One Health & Nursing ProQuest One Academic Eastern Edition (DO NOT USE) ProQuest One Academic ProQuest One Academic UKI Edition ProQuest Central China MEDLINE - Academic PubMed Central (Full Participant titles) DOAJ Directory of Open Access Journals |
DatabaseTitle | CrossRef MEDLINE Medline Complete MEDLINE with Full Text PubMed MEDLINE (Ovid) Publicly Available Content Database ProQuest One Academic Middle East (New) ProQuest Central Essentials ProQuest Health & Medical Complete (Alumni) ProQuest Central (Alumni Edition) ProQuest One Community College ProQuest One Health & Nursing ProQuest Central China ProQuest Central ProQuest Health & Medical Research Collection Health Research Premium Collection Health and Medicine Complete (Alumni Edition) ProQuest Central Korea Health & Medical Research Collection ProQuest Central (New) ProQuest Medical Library (Alumni) ProQuest One Academic Eastern Edition ProQuest Hospital Collection Health Research Premium Collection (Alumni) ProQuest Hospital Collection (Alumni) ProQuest Health & Medical Complete ProQuest Medical Library ProQuest One Academic UKI Edition ProQuest One Academic ProQuest One Academic (New) ProQuest Central (Alumni) MEDLINE - Academic |
DatabaseTitleList | CrossRef MEDLINE - Academic Publicly Available Content Database MEDLINE |
Database_xml | – sequence: 1 dbid: DOA name: DOAJ Directory of Open Access Journals url: https://www.doaj.org/ sourceTypes: Open Website – sequence: 2 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 3 dbid: EIF name: MEDLINE url: https://proxy.k.utb.cz/login?url=https://www.webofscience.com/wos/medline/basic-search sourceTypes: Index Database – sequence: 4 dbid: BENPR name: ProQuest Central url: https://www.proquest.com/central sourceTypes: Aggregation Database |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Engineering |
EISSN | 1424-8220 |
ExternalDocumentID | oai_doaj_org_article_5a7a1520df1d4734aee70398fe256b67 PMC8778132 A781291879 35062604 10_3390_s22020635 |
Genre | Journal Article |
ISSN | 1424-8220 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 2 |
Keywords | inertial measurement unit; residual network; BiLSTM; human activity recognition
Language | English |
License | https://creativecommons.org/licenses/by/4.0 Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
LinkModel | DirectLink |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s22020635 |
PMID | 35062604 |
PQID | 2621353910 |
PQPubID | 2032333 |
PublicationCentury | 2000 |
PublicationDate | 2022-01-01 |
PublicationDateYYYYMMDD | 2022-01-01 |
PublicationDate_xml | – month: 01 year: 2022 text: 2022-01-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | Switzerland |
PublicationPlace_xml | – name: Switzerland – name: Basel |
PublicationTitle | Sensors (Basel, Switzerland) |
PublicationTitleAlternate | Sensors (Basel) |
PublicationYear | 2022 |
Publisher | MDPI AG MDPI |
Publisher_xml | – name: MDPI AG – name: MDPI |
SourceID | doaj pubmedcentral proquest gale pubmed crossref |
SourceType | Open Website Open Access Repository Aggregation Database Index Database Enrichment Source |
StartPage | 635 |
SubjectTerms | Accuracy; Analysis; BiLSTM; Cameras; Classification; Datasets; Deep Learning; Human Activities; human activity recognition; Humans; inertial measurement unit; Machine learning; Neural networks; Neural Networks, Computer; Older people; Physical fitness; Rehabilitation; residual network; Running; Sensors; Smartphones; Support vector machines; Time series; Walking
Title | Human Activity Recognition Based on Residual Network and BiLSTM |
URI | https://www.ncbi.nlm.nih.gov/pubmed/35062604 https://www.proquest.com/docview/2621353910 https://www.proquest.com/docview/2622281510 https://pubmed.ncbi.nlm.nih.gov/PMC8778132 https://doaj.org/article/5a7a1520df1d4734aee70398fe256b67 |
Volume | 22 |
hasFullText | 1 |
inHoldings | 1 |
linkProvider | Scholars Portal |