Personalization without user interruption: boosting activity recognition in new subjects using unlabeled data
| Published in | 2017 ACM/IEEE 8th International Conference on Cyber-Physical Systems (ICCPS), pp. 293–302 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | New York, NY, USA: ACM, 18.04.2017 |
| Series | ACM Other Conferences |
| Subjects | |
| Online Access | Get full text |
Summary: Activity recognition systems are widely used to monitor physical and physiological conditions and to observe short- and long-term behavioral patterns, with the goal of improving users' health and wellbeing. The major obstacle to widespread use of these systems is the need to collect labeled data to train the activity recognition model. While a personalized model outperforms a user-independent model, collecting labels from every single user is burdensome and in some cases impractical. In this paper, we propose an uninformed cross-subject transfer learning algorithm that leverages cross-user similarities: it constructs a network-based, feature-level representation of the data from the source and target users and performs best-effort community detection to extract the core observations in the target data. The algorithm uses a heuristic classifier-based mapping to assign activity labels to these core observations. Finally, the output of this labeling is conditionally fused with the predictions of the source model to produce a boosted, personalized activity recognition algorithm. Our analysis on real-world data demonstrates the superiority of our approach: our algorithm achieves over 87% accuracy on average, which is 7% higher than the state-of-the-art approach.
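The record reproduces only the abstract, so the concrete construction is not available here; still, the pipeline it outlines (a similarity network over unlabeled target data, community detection to find core observations, classifier-based pseudo-labeling, and conditional fusion with the source model) can be sketched. The Python below is a hypothetical illustration, not the authors' implementation: the k-NN graph, the greedy-modularity community detection, the highest-degree notion of a "core" observation, and the confidence-threshold fusion rule are all assumptions made for this sketch.

```python
# Hypothetical sketch of the cross-subject transfer pipeline described in the
# abstract. All construction details below are assumptions for illustration.
import numpy as np
import networkx as nx
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import NearestNeighbors


def personalize(source_X, source_y, target_X, k=10, conf_threshold=0.7):
    """Return a prediction function for an unlabeled target user.

    source_X / source_y: labeled observations from source users (numpy arrays).
    target_X: unlabeled observations from the new target user.
    """
    source_y = np.asarray(source_y)

    # 1. User-independent source model trained on labeled source-user data.
    source_model = RandomForestClassifier(n_estimators=200).fit(source_X, source_y)

    # 2. Network-based feature-level representation: a k-NN similarity graph
    #    over the target user's unlabeled observations.
    _, idx = NearestNeighbors(n_neighbors=k).fit(target_X).kneighbors(target_X)
    graph = nx.Graph()
    graph.add_nodes_from(range(len(target_X)))
    for i, neighbors in enumerate(idx):
        graph.add_edges_from((i, int(j)) for j in neighbors if int(j) != i)

    # 3. Best-effort community detection; keep only reasonably large communities.
    communities = [c for c in
                   nx.algorithms.community.greedy_modularity_communities(graph)
                   if len(c) >= k]

    # 4. Heuristic classifier-based mapping: label the highest-degree ("core")
    #    node of each community with the source model, then propagate that
    #    label to the rest of the community as a pseudo-label.
    pseudo_y = np.empty(len(target_X), dtype=source_y.dtype)
    mask = np.zeros(len(target_X), dtype=bool)
    for community in communities:
        core = max(community, key=graph.degree)
        label = source_model.predict(target_X[[core]])[0]
        members = list(community)
        pseudo_y[members] = label
        mask[members] = True

    if not mask.any():  # no usable communities: keep the source model as-is
        return source_model.predict

    # 5. Personalized model trained on the pseudo-labeled target observations.
    personal_model = RandomForestClassifier(n_estimators=200).fit(
        target_X[mask], pseudo_y[mask])

    # 6. Conditional fusion: use the personalized model when it is confident,
    #    otherwise fall back to the source model's prediction.
    def predict(X):
        proba = personal_model.predict_proba(X)
        confident = proba.max(axis=1) >= conf_threshold
        fused = source_model.predict(X)
        fused[confident] = personal_model.classes_[proba.argmax(axis=1)][confident]
        return fused

    return predict
```

Under these assumptions, the returned `predict` function would be applied to new observations from the target user without requiring any labels from that user.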
ISBN: 9781450349659, 145034965X
DOI: 10.1145/3055004.3055015