Human activity recognition based on smartphone using fast feature dimensionality reduction technique


Bibliographic Details
Published in: Journal of Ambient Intelligence and Humanized Computing, Vol. 12, no. 2, pp. 2365–2374
Main Authors: Mohammed Hashim, B. A.; Amutha, R.
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.02.2021

Summary: Human activity recognition aims to identify the activities carried out by a person. Recognition is possible using information retrieved from various physiological signals via sensors attached to the subject's body. Lately, sensors such as the accelerometer and gyroscope are built into the smartphone itself, which makes activity recognition very simple. For an activity recognition system to work properly on a power-constrained smartphone, it is essential to use an optimization technique that reduces the number of features in the dataset with little time consumption. In this paper, we propose a dimensionality reduction technique called the fast feature dimensionality reduction technique (FFDRT). A publicly available dataset (UCI HAR repository) is used in this work. Results from this study show that the fast feature dimensionality reduction technique reduced the number of features from 561 to 66 while maintaining an activity recognition accuracy of 98.72% using a random forest classifier, and the time consumed in the dimensionality reduction stage using FFDRT is well below that of state-of-the-art techniques.
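
The record does not describe FFDRT's internal steps, so the following Python sketch only illustrates the pipeline the summary outlines: reduce the 561 UCI HAR features, then classify with a random forest. The correlation-threshold filter used here is a generic stand-in, not the paper's FFDRT, and the data arrays are placeholders for the real UCI HAR feature matrix.

    # Illustrative sketch only: a correlation-threshold filter stands in for
    # FFDRT, whose internal steps are not described in this record, and the
    # arrays below are placeholders for the real UCI HAR feature matrix.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    def reduce_features(X, threshold=0.9):
        # Greedily keep a feature only if its absolute correlation with every
        # already-kept feature is below the threshold (generic filter, not FFDRT).
        corr = np.abs(np.corrcoef(X, rowvar=False))
        keep = []
        for j in range(corr.shape[1]):
            if all(corr[j, k] < threshold for k in keep):
                keep.append(j)
        return np.array(keep)

    rng = np.random.default_rng(0)
    X = rng.random((1000, 561))      # placeholder: 561 features, as in UCI HAR
    y = rng.integers(0, 6, 1000)     # placeholder: 6 activity classes

    cols = reduce_features(X)
    X_tr, X_te, y_tr, y_te = train_test_split(X[:, cols], y,
                                              test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print(f"kept {len(cols)} of 561 features; "
          f"accuracy = {accuracy_score(y_te, clf.predict(X_te)):.4f}")

On the actual dataset the paper reports a reduction from 561 to 66 features at 98.72% accuracy; this generic filter run on placeholder data will of course select a different subset and score near chance.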
ISSN: 1868-5137, 1868-5145
DOI: 10.1007/s12652-020-02351-x