VirtualHAR: Virtual Sensing Device and Correlation-Based Learning Approach for Multiwearable Sensing Device-Based Human Activity Recognition

Bibliographic Details
Published in IEEE Internet of Things Journal, Vol. 12, no. 13, pp. 23577–23597
Main Authors Ahmad, Nafees; Leung, Ho-Fung; Farnia, Farzan
Format Journal Article
Language English
Published Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2025
Abstract Human activity recognition (HAR) is a prominent research direction in ubiquitous computing. Current state-of-the-art HAR models achieve great success by learning correlations between regions of body parts through the sensing devices attached to them for feature extraction. However, explicitly computing the correlations between whole body parts and whole sub-body parts, which is crucial for extracting discriminative features for some activities, has not been investigated, owing to the lack of sensing devices that capture the movements of whole (sub-)body parts. This study proposes an effective yet lightweight VirtualHAR framework that automatically models correlations between whole body parts, whole sub-body parts, and regions based on the concept of virtual sensing devices. The VirtualHAR framework comprises three modules. The backbone feature extraction (BFE) module extracts features from each physical sensing device; based on these features, the multipurpose correlations learning module constructs virtual sensing devices for body parts and sub-body parts and then exploits the explicit correlations between body parts and sub-body parts, as well as within regions, using their attached physical sensing devices. Finally, the global aggregation (GA) module learns a GA representation for each physical sensing device by collecting the learned correlated representations from every virtual and physical sensing device. Comprehensive experiments on benchmark HAR datasets and a resource-constrained device confirm that VirtualHAR outperforms state-of-the-art (SOTA) models in both recognition performance and computational cost. Thorough quantitative and qualitative analyses validate the proposed framework's effectiveness and efficiency.
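To make the three-module pipeline concrete, below is a minimal sketch of the idea as described in the abstract, not the authors' implementation: a shared backbone extracts per-device features, virtual sensing devices are formed by pooling the features of the physical devices attached to each (sub-)body part, correlations across all physical and virtual devices are learned explicitly, and a global aggregation step feeds the classifier. The PyTorch framing, the attention-based correlation step, the mean-pooling construction of virtual devices, the part grouping, and all module sizes are illustrative assumptions.

```python
# Minimal sketch of the VirtualHAR idea (assumed, not the paper's code).
import torch
import torch.nn as nn

class VirtualHARSketch(nn.Module):
    def __init__(self, n_channels=3, feat_dim=64, n_classes=6,
                 part_groups=((0, 1), (2, 3))):  # device indices per (sub-)body part (assumed)
        super().__init__()
        # Backbone feature extraction (BFE): one small CNN shared across
        # physical sensing devices, applied to each device's sensor window.
        self.backbone = nn.Sequential(
            nn.Conv1d(n_channels, feat_dim, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.part_groups = part_groups
        # Correlation learning over physical + virtual devices; self-attention
        # stands in for the paper's correlation module here.
        self.attn = nn.MultiheadAttention(feat_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, x):
        # x: (batch, n_devices, n_channels, window_len) raw IMU windows
        b, d, c, t = x.shape
        phys = self.backbone(x.view(b * d, c, t)).view(b, d, -1)  # (b, d, feat)
        # Virtual sensing devices: pool the features of the physical devices
        # attached to each (sub-)body part into one virtual-device feature.
        virt = torch.stack(
            [phys[:, list(g)].mean(dim=1) for g in self.part_groups], dim=1)
        tokens = torch.cat([phys, virt], dim=1)        # physical + virtual devices
        corr, _ = self.attn(tokens, tokens, tokens)    # explicit cross-device correlations
        # Global aggregation (GA): pool correlated representations, then classify.
        return self.classifier(corr.mean(dim=1))

# Usage: 4 physical devices, 3-axis accelerometer, 128-sample windows.
model = VirtualHARSketch()
logits = model(torch.randn(2, 4, 3, 128))
print(logits.shape)  # torch.Size([2, 6])
```

With real data, the grouping would mirror the dataset's sensor placement, e.g., pooling wrist- and upper-arm-mounted devices into an arm-level virtual device.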
Author Farnia, Farzan
Ahmad, Nafees
Leung, Ho-Fung
Author_xml – sequence: 1
  givenname: Nafees
  orcidid: 0000-0001-5650-8602
  surname: Ahmad
  fullname: Ahmad, Nafees
  email: nafees@link.cuhk.edu.hk
  organization: Department of Computer Science and Engineering, The Chinese University of Hong Kong, Ma Liu Shui, Hong Kong SAR, China
– sequence: 2
  givenname: Ho-Fung
  orcidid: 0000-0003-4914-2934
  surname: Leung
  fullname: Leung, Ho-Fung
  email: ho-fung.leung@outlook.com
– sequence: 3
  givenname: Farzan
  orcidid: 0000-0002-6049-9232
  surname: Farnia
  fullname: Farnia, Farzan
  email: farnia@cse.cuhk.edu.hk
  organization: Department of Computer Science and Engineering, The Chinese University of Hong Kong, Ma Liu Shui, Hong Kong SAR, China
CODEN IITJAU
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025
DOI 10.1109/JIOT.2025.3555799
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005-present
IEEE Open Access Journals
IEEE All-Society Periodicals Package (ASPP) 1998-Present
IEEE Electronic Library (IEL)
CrossRef
Computer and Information Systems Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DatabaseTitle CrossRef
Computer and Information Systems Abstracts
Technology Research Database
Computer and Information Systems Abstracts – Academic
Advanced Technologies Database with Aerospace
ProQuest Computer Science Collection
Computer and Information Systems Abstracts Professional
DatabaseTitleList Computer and Information Systems Abstracts
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Computer Science
EISSN 2327-4662
EndPage 23597
ExternalDocumentID 10_1109_JIOT_2025_3555799
10974403
Genre orig-research
ISSN 2327-4662
IngestDate Thu Aug 28 18:07:27 EDT 2025
Thu Jul 03 08:42:51 EDT 2025
Wed Aug 27 01:46:11 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed false
IsScholarly true
Issue 13
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0003-4914-2934
0000-0002-6049-9232
0000-0001-5650-8602
OpenAccessLink https://ieeexplore.ieee.org/document/10974403
PQID 3222670815
PQPubID 2040421
PageCount 21
ParticipantIDs crossref_primary_10_1109_JIOT_2025_3555799
ieee_primary_10974403
proquest_journals_3222670815
PublicationCentury 2000
PublicationDate 2025-07-01
PublicationDateYYYYMMDD 2025-07-01
PublicationDate_xml – month: 07
  year: 2025
  text: 2025-07-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE internet of things journal
PublicationTitleAbbrev JIoT
PublicationYear 2025
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Index Database
Publisher
StartPage 23577
SubjectTerms Body parts
Computational modeling
Convolutional neural networks
Correlation
Data mining
Deep learning
Devices
Effectiveness
Feature extraction
Human activity recognition
human activity recognition (HAR)
Human motion
Internet of Things
Learning
Legged locomotion
Manuals
Modules
Performance evaluation
Qualitative analysis
Representations
Soft sensors
ubiquitous and mobile computing
Ubiquitous computing
Virtual sensors
wearable sensing devices
Title VirtualHAR: Virtual Sensing Device and Correlation-Based Learning Approach for Multiwearable Sensing Device-Based Human Activity Recognition
URI https://ieeexplore.ieee.org/document/10974403
https://www.proquest.com/docview/3222670815
Volume 12