Human activity recognition with analysis of angles between skeletal joints using a RGB‐depth sensor
| Published in | ETRI Journal, Vol. 42, no. 1, pp. 78–89 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Electronics and Telecommunications Research Institute (ETRI), 01.02.2020 |
| Subjects | |
| ISSN | 1225-6463; 2233-7326 |
| DOI | 10.4218/etrij.2018-0577 |
Summary: Human activity recognition (HAR) has become an effective computer vision tool for video surveillance systems. In this paper, a novel biometric system that can detect human activities in 3D space is proposed. To implement HAR, joint angles obtained with an RGB-depth sensor are used as features. Because HAR operates in the time domain, the angle information is stored using a sliding kernel method. A Haar wavelet transform (HWT) is applied to preserve the feature information before the data dimension is reduced. Dimension reduction using an averaging algorithm is also applied to decrease the computational cost, which provides faster performance while maintaining high accuracy. Before classification, a proposed thresholding method with an inverse HWT is applied to extract the final feature set. Finally, the k-nearest neighbor (k-NN) algorithm recognizes the activity from the given data. The method compares favorably with other machine learning algorithms.
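The pipeline outlined in the summary lends itself to a compact illustration. Below is a minimal, hypothetical sketch in Python (assuming NumPy and scikit-learn) of the general idea only: a one-level Haar transform on each joint-angle sequence, block-averaging for dimension reduction, and k-NN classification. All function names, window sizes, and the synthetic data are illustrative assumptions, not the authors' implementation, and the paper's sliding-kernel storage and inverse-HWT thresholding steps are not reproduced here.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def haar_dwt(signal):
    """One-level Haar wavelet transform: returns (approximation, detail)."""
    s = np.asarray(signal, dtype=float)
    if s.size % 2:                       # pad odd-length input to even length
        s = np.append(s, s[-1])
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    return approx, detail

def window_features(window, block=4):
    """Features for one window of joint angles (shape [time, n_angles]):
    keep the Haar approximation coefficients, then block-average them to
    shrink the dimension (a simplified stand-in for the paper's
    averaging-based reduction)."""
    feats = []
    for col in window.T:                 # each joint-angle sequence over time
        approx, _ = haar_dwt(col)        # low-frequency content of the angle
        n = (approx.size // block) * block
        feats.append(approx[:n].reshape(-1, block).mean(axis=1))
    return np.concatenate(feats)

# Hypothetical toy data: two "activities" that differ in the oscillation
# frequency of 8 joint angles over a 64-sample window.
rng = np.random.default_rng(0)

def make_window(freq):
    t = np.linspace(0.0, 1.0, 64)
    phases = rng.uniform(0.0, 1.0, 8)
    return np.stack([np.sin(2 * np.pi * freq * t + p) for p in phases], axis=1)

X = np.array([window_features(make_window(f)) for f in [1.0] * 20 + [4.0] * 20])
y = np.array([0] * 20 + [1] * 20)

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)   # k-NN classification step
print(knn.predict(X[:2]))
```

Keeping only the block-averaged approximation coefficients is what makes this cheap: each 64-sample angle sequence collapses to 8 numbers before k-NN ever sees it, which is the kind of cost reduction the abstract attributes to the averaging algorithm.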