Contact-less indoor activity analysis using first-reflection echolocation

Bibliographic Details
Published in: IEEE International Conference on Communications (ICC), 2016, pp. 1-6
Main Authors: Biswas, Subir; Harrington, Brandon; Hajiaghajani, Faezeh; Wang, Rui
Format: Conference Proceeding, Journal Article
Language: English
Published: IEEE, 01.05.2016
ISSN: 1938-1883
DOI: 10.1109/ICC.2016.7510731

Summary: This paper presents an ultrasound echolocation-based approach for human activity recognition in indoor settings. The key novelty of the proposed approach is to perform activity analysis using distance estimated through "first-reflection echolocation". The distance to the nearest obstructing object is computed from the first reflected ultrasound signal; all subsequent reflected signal components from other, more distant objects are ignored. This leads to an extremely simple signal-analysis approach, operating on time-series distance data, with very low computational complexity, especially compared with existing approaches in the literature, in which the full reflected signal, often with Doppler shift computation, is analyzed for activity classification. It is demonstrated that, for the goal of isolating workplace sedentary behavior, the proposed approach can differentiate between sitting, standing, and walking (i.e., in-office pacing) with more than 80% accuracy. This was validated with different classifiers applied to data collected from multiple subjects in multiple sessions. Recorded video was used as the ground truth for training the classifiers.
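
Because the approach reduces each ultrasound ping to a single distance value, a brief illustration may help. The sketch below is not taken from the paper; the detection threshold, the assumption that the echo trace starts at the transmit instant, and the mean/standard-deviation/range feature set are all illustrative choices. It shows how the time of flight of the first echo could be converted to a distance, and how the resulting distance time series could be turned into windowed features for one of the classifiers mentioned in the summary.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C (assumed value)


def first_reflection_distance(echo, fs, threshold=0.5):
    """Distance to the nearest obstructing object from one received echo trace.

    Only the first sample whose magnitude reaches `threshold` times the trace
    maximum is used; every later reflection is ignored, mirroring the
    first-reflection idea described in the summary. `echo` is assumed to start
    at the moment of transmission, and `fs` is the sampling rate in Hz.
    """
    mag = np.abs(np.asarray(echo, dtype=float))
    peak = mag.max()
    if peak == 0.0:
        return None  # no reflection detected
    first_idx = int(np.argmax(mag >= threshold * peak))  # first strong echo sample
    round_trip_time = first_idx / fs                     # seconds to first echo
    return SPEED_OF_SOUND * round_trip_time / 2.0        # one-way distance in metres


def distance_features(distances, window=20):
    """Windowed mean / standard deviation / range features computed from the
    per-ping distance time series, suitable for a generic supervised classifier."""
    feats = []
    for start in range(0, len(distances) - window + 1, window):
        w = np.asarray(distances[start:start + window], dtype=float)
        feats.append([w.mean(), w.std(), w.max() - w.min()])
    return np.array(feats)
```

Under this reading, the sitting/standing/walking decision itself would be made by any standard supervised classifier trained on such features, with the video-derived annotations serving as ground-truth labels, as described in the summary.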