Qualitative and Quantitative Spatio-temporal Relations in Daily Living Activity Recognition

Bibliographic Details
Published in: Computer Vision -- ACCV 2014, pp. 115-130
Main Authors: Tayyub, Jawad; Tavanai, Aryana; Gatsoulis, Yiannis; Cohn, Anthony G.; Hogg, David C.
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing, 2015
Series: Lecture Notes in Computer Science

Summary: For the effective operation of intelligent assistive systems working in real-world human environments, it is important to be able to recognise human activities and their intentions. In this paper we propose a novel approach to activity recognition from visual data. Our approach is based on qualitative and quantitative spatio-temporal features that encode the interactions between human subjects and objects in an efficient manner. Unlike the state of the art, our approach uses significantly fewer assumptions and does not require knowledge about object types, their affordances, or the sub-level activities that high-level activities consist of. We perform an automatic feature selection process which provides the most representative descriptions of the learnt activities. We validated the method using these descriptions on the CAD-120 benchmark dataset, consisting of video sequences showing humans performing daily real-world activities. The method is shown to outperform state-of-the-art benchmarks.
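To illustrate the kind of qualitative and quantitative spatio-temporal features the summary refers to, the following is a minimal Python sketch: it derives one qualitative relation (the trend of the human-object distance over a clip) and one quantitative descriptor (the mean distance) from tracked positions. The function names, the "approaching"/"receding"/"static" labels, and the toy data are illustrative assumptions, not the feature set actually used in the paper.

```python
# Hypothetical sketch of one qualitative and one quantitative
# spatio-temporal feature between a human track and an object track.
# Not the authors' exact features; labels and thresholds are assumptions.
from typing import List, Tuple
import math

Point = Tuple[float, float]  # (x, y) image coordinates per frame


def distances(human: List[Point], obj: List[Point]) -> List[float]:
    """Euclidean distance between human and object in each frame."""
    return [math.hypot(hx - ox, hy - oy)
            for (hx, hy), (ox, oy) in zip(human, obj)]


def qualitative_relation(dists: List[float], eps: float = 1.0) -> str:
    """Label the overall distance trend across the clip."""
    change = dists[-1] - dists[0]
    if change < -eps:
        return "approaching"
    if change > eps:
        return "receding"
    return "static"


def quantitative_feature(dists: List[float]) -> float:
    """Simple quantitative descriptor: mean human-object distance."""
    return sum(dists) / len(dists)


if __name__ == "__main__":
    # Toy tracks: a hand moves toward a cup over five frames.
    hand = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
    cup = [(10.0, 0.0)] * 5
    d = distances(hand, cup)
    print(qualitative_relation(d), round(quantitative_feature(d), 2))
    # -> approaching 8.0
```

In a full pipeline, such per-clip relations and descriptors would be pooled over many human-object pairs and passed through a feature selection step before classification; the above only shows the feature-extraction idea at its simplest.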
Bibliography: Electronic supplementary material: The online version of this chapter (doi:10.1007/978-3-319-16814-2_8) contains supplementary material, which is available to authorized users.
ISBN: 9783319168135; 3319168134
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-16814-2_8