Machine learning models for wearable-based human activity recognition: A comparative study
| Published in | Neurocomputing (Amsterdam) Vol. 650; p. 130911 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Elsevier B.V., 14.10.2025 |
Summary: In recent years, Human Activity Recognition (HAR) has attracted increasing interest in healthcare due to its ability to identify Activities of Daily Living (ADL), which are crucial for supporting the independence and quality of life of elderly individuals. Network architectures combined with wearable devices enable applications such as remote monitoring, accident prevention, and rehabilitation support. However, real-world implementation remains hindered by challenges such as sensor configuration and placement, model complexity, and generalizability across different activity types. In this paper, we aim to bridge the gap between HAR research and its design challenges for practical adoption in healthcare. We systematically evaluate nine models, ranging from traditional machine learning to complex deep learning architectures, across two datasets: OPPORTUNITY and CogAge. We focus separately on the classification of state and behavioural activities, which must be carefully selected for different healthcare contexts and require different modeling strategies. The assessed models include Support Vector Machines (SVMs) with hand-crafted features; standard neural networks such as Convolutional Neural Networks (CNNs), Gated Recurrent Units (GRUs), and Long Short-Term Memory networks (LSTMs); and more complex models such as Transformer networks and hybrid Convolutional LSTMs with and without attention. Our results demonstrate strong performance in recognizing state activities (94.10%–96.48% Average F1-Score, AF1). Behavioural activities were best recognized by the GRU model on OPPORTUNITY (79.91% AF1) and by the SVM with hand-crafted features on CogAge (69.23% AF1). Beyond insights into model performance, we provide a discussion of key design considerations for wearable-based HAR systems in healthcare.
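The Average F1-Scores reported above can be read as macro-averaged F1: a per-class F1 is computed for each activity and then averaged with equal weight, so rare activities count as much as frequent ones. A minimal sketch of this metric, assuming the macro-averaging interpretation and using hypothetical toy activity labels (the abstract does not specify the exact averaging scheme):

```python
def average_f1(y_true, y_pred):
    """Macro-averaged F1: per-class F1 scores averaged with equal class weight."""
    classes = sorted(set(y_true) | set(y_pred))
    f1_scores = []
    for c in classes:
        # count true positives, false positives, false negatives for class c
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        f1_scores.append(f1)
    return sum(f1_scores) / len(f1_scores)

# hypothetical toy labels for three activity classes
truth = ["sit", "sit", "stand", "walk", "walk", "walk"]
pred  = ["sit", "stand", "stand", "walk", "walk", "sit"]
print(round(average_f1(truth, pred), 4))  # → 0.6556
```

Because each class contributes equally, a model that ignores an infrequent activity is penalized more heavily under macro F1 than under plain accuracy, which is why it is a common choice for imbalanced HAR datasets.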
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2025.130911