Location-Aware Human Activity Recognition

Bibliographic Details
Published in: Advanced Data Mining and Applications, Vol. 10604, pp. 821-835
Main Authors: Nguyen, Tam T.; Fernandez, Daniel; Nguyen, Quy T. K.; Bagheri, Ebrahim
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2017
Series: Lecture Notes in Computer Science
Summary: In this paper, we present one of the winning solutions of an international human activity recognition challenge organized by DrivenData in conjunction with the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases. The objective of the challenge was to predict activities of daily living and posture or ambulation based on wrist-worn accelerometer, RGB-D camera, and passive environmental sensor data collected from a smart home in the UK. Most state-of-the-art research focuses on a single type of data, e.g., wearable sensor data, for making predictions and overlooks the usefulness of user location for this purpose. In our work, we propose a novel approach that leverages heterogeneous data types as well as user locations to build predictive models. Note that actual location information is not available, so we build machine learning models to predict user location and use those predictions in user activity recognition. Compared to the state of the art, our proposed approach achieves a 38% improvement, with a Brier score of 0.1346. This means that roughly 9 out of 10 predictions matched the human-labeled descriptions.
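The two-stage idea in the summary (first predict the user's location, then feed those predictions into the activity classifier) and the challenge's Brier-score metric can be sketched as follows. This is a minimal illustration on synthetic data; the RandomForestClassifier choice, the feature layout, and the room/activity class counts are assumptions for demonstration, not the authors' actual pipeline.

```python
# Sketch of the two-stage approach described in the abstract:
# Stage 1 predicts location from sensor features; Stage 2 uses the
# predicted location probabilities as extra features for activity
# recognition. All data below is synthetic and the model choices are
# illustrative assumptions, not the paper's exact method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-ins for accelerometer / RGB-D / environmental sensor features,
# room (location) labels, and activity labels.
X = rng.normal(size=(1000, 20))
room = rng.integers(0, 4, size=1000)       # 4 hypothetical rooms
activity = rng.integers(0, 5, size=1000)   # 5 hypothetical activities

X_tr, X_te, room_tr, room_te, act_tr, act_te = train_test_split(
    X, room, activity, test_size=0.2, random_state=0
)

# Stage 1: location model. Actual locations are unobserved at test
# time, so predicted probabilities stand in for them.
loc_model = RandomForestClassifier(random_state=0).fit(X_tr, room_tr)
loc_tr = loc_model.predict_proba(X_tr)  # in practice, use out-of-fold
loc_te = loc_model.predict_proba(X_te)  # predictions to avoid leakage

# Stage 2: activity model on raw features plus predicted location.
act_model = RandomForestClassifier(random_state=0).fit(
    np.hstack([X_tr, loc_tr]), act_tr
)
proba = act_model.predict_proba(np.hstack([X_te, loc_te]))

# Multiclass Brier score: mean squared error against one-hot labels
# (the challenge metric reported in the summary; lower is better).
onehot = np.eye(proba.shape[1])[act_te]
brier = np.mean(np.sum((proba - onehot) ** 2, axis=1))
print(f"Brier score: {brier:.4f}")
```

Note that in a real pipeline the Stage 1 probabilities used to train Stage 2 should come from cross-validated out-of-fold predictions rather than in-sample predictions, which otherwise leak training labels into the activity model; the in-sample version above only keeps the sketch short.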
ISBN: 9783319691787; 3319691783
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-69179-4_58