A Lightweight and Explainable Hybrid Deep Learning Model for Wearable Sensor-Based Human Activity Recognition
Published in | IEEE Sensors Journal, Vol. 25, No. 12, pp. 22618-22628
Format | Journal Article
Language | English
Published | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 15.06.2025
Summary | Human activity recognition (HAR) is critical for rehabilitation and clinical monitoring, but robust recognition using wearable sensors (e.g., sEMG or IMU) remains challenging due to signal noise and variability. We propose X-LiteHAR, a lightweight, explainable hybrid deep learning framework for real-time HAR, combining adaptive EEMD for noise-robust signal enhancement and a multihead CNN-LSTM for spatio-temporal feature learning. The optimized framework demonstrates efficient edge deployment through structured pruning and quantization, achieving a 70% model size reduction while maintaining competitive performance, with on-device validation on an Android OnePlus 6T smartphone showing 9-ms inference latency. The model was trained and evaluated independently on two distinct datasets: 1) the UCI sEMG dataset (muscle activity signals) and 2) the IMU-only MHealth dataset (motion signals), demonstrating the architecture's adaptability to different sensor modalities. On the UCI dataset, X-LiteHAR achieved 99.0% accuracy (healthy subjects) and 98.7% (pathological), while on MHealth (IMU-only), it reached 99.2% accuracy. Leveraging explainable AI (XAI), we interpret muscle activation patterns for personalized rehabilitation insights. By unifying signal processing, efficient deep learning, and interpretability, X-LiteHAR advances real-time HAR for clinical and wearable applications.
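The abstract describes the multihead CNN-LSTM only at a high level. Below is a minimal PyTorch sketch of one plausible reading of that architecture: one 1-D convolutional head per sensor-channel group, concatenated features fed to an LSTM, then a classifier. All layer sizes, the head count, and the class count are illustrative assumptions, not the authors' published configuration.

```python
import torch
import torch.nn as nn


class MultiHeadCNNLSTM(nn.Module):
    """Hypothetical sketch of a multihead CNN-LSTM for wearable HAR.

    Each head applies a 1-D CNN to one group of sensor channels;
    the concatenated feature maps are modeled temporally by an LSTM.
    """

    def __init__(self, n_heads=3, in_ch=1, n_classes=6, hidden=64):
        super().__init__()
        # One small convolutional feature extractor per sensor group
        self.heads = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_ch, 16, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(2),  # halves the temporal resolution
            )
            for _ in range(n_heads)
        ])
        # LSTM consumes the concatenated per-head features at each step
        self.lstm = nn.LSTM(input_size=16 * n_heads, hidden_size=hidden,
                            batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, xs):
        # xs: list of n_heads tensors, each (batch, in_ch, time)
        feats = [head(x) for head, x in zip(self.heads, xs)]
        z = torch.cat(feats, dim=1)       # (batch, 16*n_heads, time/2)
        z = z.permute(0, 2, 1)            # (batch, time/2, 16*n_heads)
        out, _ = self.lstm(z)
        return self.fc(out[:, -1])        # classify from the last step


# Example: three sensor groups, 128-sample windows, batch of 2
model = MultiHeadCNNLSTM()
windows = [torch.randn(2, 1, 128) for _ in range(3)]
logits = model(windows)                   # shape: (2, n_classes)
```

For the edge-deployment side of the abstract, such a model would then be compressed, e.g., with structured pruning and post-training quantization (PyTorch offers `torch.ao.quantization.quantize_dynamic` for the LSTM and linear layers); the specific 70% size-reduction pipeline is the paper's, not reproduced here.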
ISSN | 1530-437X, 1558-1748
DOI | 10.1109/JSEN.2025.3564045