ActiveSelfHAR: Incorporating Self-Training Into Active Learning to Improve Cross-Subject Human Activity Recognition
Published in | IEEE Internet of Things Journal, Vol. 11, No. 4, pp. 6833–6847
Main Authors | Baichun Wei, Chunzhi Yi, Qi Zhang, Haiqi Zhu, Jianfei Zhu, Feng Jiang
Format | Journal Article
Language | English
Published | Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 15 February 2024
Abstract | Deep learning (DL)-based human activity recognition (HAR) methods have shown promise in the applications of health Internet of Things (IoT) and wireless body sensor networks (BSNs). However, adapting these methods to new users in real-world scenarios is challenging due to the cross-subject issue. To solve this issue, we propose ActiveSelfHAR, a framework that combines active learning's benefit of sparsely acquiring informative samples with actual labels and self-training's benefit of effectively utilizing unlabeled data to adapt the HAR model to the target domain, i.e., the new users. ActiveSelfHAR consists of several key steps. First, we utilize the model from the source domain to select and label the domain invariant samples, forming a self-training set. Second, we leverage the distribution information of the self-training set to identify and annotate samples located around the class boundaries, forming a core set. Third, we augment the core set by considering the spatiotemporal relationships among the samples in the nonself-training set. Finally, we combine the self-training set and augmented core set to construct a diverse training set in the target domain and fine-tune the HAR model. Through leave-one-subject-out validation on three IMU-based data sets and one EMG-based data set, our method achieves mean HAR accuracies of 95.20%, 82.06%, 89.52%, and 92.82%, respectively. Our method demonstrates similar HAR accuracies to the upper bound, i.e., fine-tuning framework with approximately 1% labeled data of the target data set, while significantly improving data efficiency and time cost. Our work highlights the potential of implementing user-independent HAR methods into health IoT and BSN. |
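The abstract describes a four-step adaptation pipeline: pseudo-label confident target samples with the source model, actively query labels for samples near class boundaries, augment the queried core set from temporally adjacent samples, then fine-tune on the combined set. The following is a minimal, hypothetical sketch of that flow in Python/NumPy, not the authors' implementation: the confidence threshold, the entropy-based boundary criterion, the neighbour-based augmentation, and all names (`adapt`, `source_model`, `fine_tune`, `query_labels`) are assumptions made purely for illustration.

```python
# Hypothetical sketch of an ActiveSelfHAR-style adaptation flow as outlined in
# the abstract (not the authors' code).
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def adapt(source_model, fine_tune, query_labels, X_target, conf_thr=0.9, budget=20):
    """source_model(X) -> logits; fine_tune(X, y) -> adapted model;
    query_labels(idx) -> ground-truth labels supplied by the new user."""
    probs = softmax(source_model(X_target))
    conf = probs.max(axis=1)

    # Step 1: self-training set = confidently pseudo-labeled (assumed domain-invariant) samples.
    self_idx = np.where(conf >= conf_thr)[0]
    self_y = probs[self_idx].argmax(axis=1)

    # Step 2: core set = most uncertain remaining samples (near class boundaries); query their labels.
    rest_idx = np.where(conf < conf_thr)[0]
    entropy = -(probs[rest_idx] * np.log(probs[rest_idx] + 1e-12)).sum(axis=1)
    core_idx = rest_idx[np.argsort(entropy)[::-1][:budget]]
    core_y = query_labels(core_idx)

    # Step 3: augment the core set with temporally adjacent windows, assuming
    # neighbouring windows of the same recording share the queried label.
    aug_idx = np.clip(np.concatenate([core_idx - 1, core_idx + 1]), 0, len(X_target) - 1)
    aug_y = np.concatenate([core_y, core_y])

    # Step 4: fine-tune on the union of the self-training, core, and augmented sets.
    train_idx = np.concatenate([self_idx, core_idx, aug_idx])
    train_y = np.concatenate([self_y, core_y, aug_y])
    return fine_tune(X_target[train_idx], train_y)
```

Under the leave-one-subject-out protocol mentioned in the abstract, a routine like this would run once per held-out user with a small query budget; the abstract compares the result against a fine-tuning upper bound that uses roughly 1% of the target data set's labels.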
Authors and Affiliations |
– Baichun Wei (ORCID 0000-0002-2407-3410), Faculty of Computing, Harbin Institute of Technology, Harbin, China
– Chunzhi Yi (ORCID 0000-0002-4180-1109), School of Medicine and Health, Harbin Institute of Technology, Harbin, China
– Qi Zhang (ORCID 0000-0002-9119-9523), Faculty of Computing, Harbin Institute of Technology, Harbin, China
– Haiqi Zhu (ORCID 0000-0002-5076-2412), Faculty of Computing, Harbin Institute of Technology, Harbin, China
– Jianfei Zhu, Faculty of Computing, Harbin Institute of Technology, Harbin, China
– Feng Jiang (ORCID 0000-0001-8342-1211, fjiang@hit.edu.cn), Faculty of Computing and the School of Medicine and Health, Harbin Institute of Technology, Harbin, China
CODEN | IITJAU |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
DOI | 10.1109/JIOT.2023.3314150 |
Discipline | Computer Science |
EISSN | 2327-4662 |
EndPage | 6847 |
Genre | orig-research |
GrantInformation | National Natural Science Foundation of China, Grant 62076080 (funder ID 10.13039/501100001809)
ISSN | 2327-4662 |
Issue | 4 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
PageCount | 15 |
PublicationDate | 2024-02-15 |
PublicationPlace | Piscataway |
PublicationTitle | IEEE internet of things journal |
PublicationTitleAbbrev | JIoT |
PublicationYear | 2024 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 6833 |
SubjectTerms | Active learning; Adaptation models; Body area networks; Correlation; Costs; Cross-subject adaptation; Data models; Datasets; Deep learning; deep learning (DL); Human activity recognition; Internet of medical things; Internet of Things; Labeling; Labels; Machine learning; semi-supervised learning; Training; Upper bounds; wearable sensors; Wireless networks; Wireless sensor networks
Title | ActiveSelfHAR: Incorporating Self-Training Into Active Learning to Improve Cross-Subject Human Activity Recognition |
URI | https://ieeexplore.ieee.org/document/10247253 https://www.proquest.com/docview/2923136486 |
Volume | 11 |