Physique-Based Human Activity Recognition Using Ensemble Learning and Smartphone Sensors


Bibliographic Details
Published in: IEEE Sensors Journal, Vol. 21, No. 15, pp. 16852-16860
Main Authors: Choudhury, Nurul Amin; Moulik, Soumen; Roy, Diptendu Sinha
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2021

Summary: Traditional methods of Human Activity Recognition (HAR) do not take into consideration physical attributes of human subjects such as height, weight, and gender. As a result, a given recognition model does not perform consistently across subjects with diverse physical attributes. In this paper, we propose a novel physique-based HAR method that addresses this problem and provides better accuracy in comparatively less time. Raw sensor data are acquired from the inbuilt accelerometer and gyroscope modules of smartphones. After pre-processing the collected data, physique-based datasets are prepared according to the similarity of the subjects' physiques. Both these physique-based datasets and the traditional dataset are analysed with various machine learning algorithms. The work not only identifies suitable learning algorithms for HAR but also shows that the proposed physique-based method outperforms the traditional HAR approach on both a publicly available dataset and our own generated dataset, achieving an accuracy of 99.88%. Individual activity-wise accuracy results are also compared with several recent benchmarks to demonstrate the efficiency of the proposed physique-based HAR method.
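
To make the idea of "physique-based" partitioning concrete, the following is a minimal sketch only; this record does not state the paper's actual preprocessing, grouping rule, or classifiers, so the k-means grouping on (height, weight, gender), the RandomForest per-group ensemble, and all variable names are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch (assumed design): group subjects by physique similarity,
# then train one ensemble classifier per physique group instead of a single
# pooled HAR model. Synthetic data is used so the example is self-contained.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in for pre-processed accelerometer/gyroscope windows: one row per
# window, with the subject's physique attributes and an activity label.
n = 1200
physique = np.column_stack([
    rng.normal(170, 10, n),        # height in cm (hypothetical attribute)
    rng.normal(70, 12, n),         # weight in kg (hypothetical attribute)
    rng.integers(0, 2, n),         # gender encoded 0/1 (hypothetical)
])
sensor_features = rng.normal(size=(n, 12))  # e.g. per-window acc/gyro statistics
activity = rng.integers(0, 6, n)            # six activity classes (assumed)

# 1) Group windows by physique similarity (assumption: k-means, 3 groups).
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(physique)

# 2) Train and evaluate one ensemble model per physique group.
for g in np.unique(groups):
    mask = groups == g
    X_tr, X_te, y_tr, y_te = train_test_split(
        sensor_features[mask], activity[mask], test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print(f"group {g}: accuracy = {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```

Under this assumed design, a new user would first be assigned to the closest physique group and then classified with that group's model, rather than with one model trained over all subjects.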
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2021.3077563