HRDepthNet: Depth Image-Based Marker-Less Tracking of Body Joints

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 21, No. 4, p. 1356
Main Authors: Büker, Linda Christin; Zuber, Finnja; Hein, Andreas; Fudickar, Sebastian
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 14.02.2021
ISSN: 1424-8220
DOI: 10.3390/s21041356

Summary: With approaches such as HRNet and OpenPose available for detecting joint positions in color images, corresponding approaches for depth images have received little attention, even though depth images offer several advantages over color images, such as robustness to lighting variation and invariance to color and texture. We therefore introduce High-Resolution Depth Net (HRDepthNet), a machine-learning-driven approach for detecting human joints (body, head, and upper and lower extremities) purely in depth images. HRDepthNet retrains the original HRNet for depth images. To this end, a dataset was created containing depth (and RGB) images recorded while subjects performed the Timed Up and Go test, an established geriatric assessment. Joint positions were manually annotated in the corresponding RGB images, and training and evaluation were conducted with this dataset. For the accuracy evaluation, body-joint detection was assessed with COCO's evaluation metrics, which indicated that the resulting depth image-based model achieved better results than HRNet trained on and applied to the corresponding RGB images. An additional evaluation of the position errors showed median deviations of 1.619 cm (x-axis), 2.342 cm (y-axis) and 2.4 cm (z-axis).
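
The accuracy evaluation described in the summary follows COCO's keypoint metrics, and the position errors are reported as per-axis median deviations. The Python sketch below illustrates how such an evaluation is commonly run with the pycocotools package and NumPy; the file names and array shapes are hypothetical placeholders, and the snippet is an assumed, minimal illustration rather than the authors' exact pipeline.

    import numpy as np
    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    # COCO-style keypoint evaluation (AP / AR). The annotation and result
    # file names below are placeholders, not files from the paper.
    coco_gt = COCO("annotations/person_keypoints_val.json")
    coco_dt = coco_gt.loadRes("hrdepthnet_keypoint_results.json")

    evaluator = COCOeval(coco_gt, coco_dt, iouType="keypoints")
    evaluator.evaluate()
    evaluator.accumulate()
    evaluator.summarize()  # prints the standard COCO keypoint AP / AR values

    # Per-axis position error: median absolute deviation between predicted
    # and ground-truth joint positions in camera coordinates (centimetres).
    pred_xyz = np.load("pred_joints_cm.npy")  # placeholder, shape (N, 3)
    gt_xyz = np.load("gt_joints_cm.npy")      # placeholder, shape (N, 3)
    median_dev = np.median(np.abs(pred_xyz - gt_xyz), axis=0)
    print("median deviation (x, y, z) in cm:", median_dev)

In this sketch, summarize() reports the COCO keypoint average precision and recall, while the final lines reproduce the kind of per-axis median deviation quoted in the summary (x, y and z in centimetres).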