Child body shape measurement using depth cameras and a statistical body shape model

Bibliographic Details
Published in: Ergonomics, Vol. 58, No. 2, pp. 301-309
Main Authors: Park, Byoung-Keon; Lumeng, Julie C.; Lumeng, Carey N.; Ebert, Sheila M.; Reed, Matthew P.
Format: Journal Article
Language: English
Published: England: Taylor & Francis, 01.02.2015

More Information
Summary: We present a new method for rapidly measuring child body shapes from noisy, incomplete data captured from low-cost depth cameras. This method fits the data using a statistical body shape model (SBSM) to find a complete avatar in the realistic body shape space. The method also predicts a set of standard anthropometric data for a specific subject without measuring dimensions directly from the fitted model. Since the SBSM was developed using principal component (PC) analysis, we formulate an optimisation problem to fit the model in which the degrees of freedom are defined in PC-score space. The mean unsigned distance between the fitted model based on depth-camera data and the high-resolution laser scan data was 9.4 mm with a standard deviation (SD) of 5.1 mm. For the torso, the mean distance was 2.9 mm (SD 1.4 mm). The correlations between standard anthropometric dimensions predicted by the SBSM and manually measured dimensions exceeded 0.9.

Practitioner Summary: Rapid and robust body shape measurement is beneficial for tracking child body shapes and anthropometric changes. A custom avatar generated by rapidly fitting a statistical body shape model to noisy scan data showed the potential for good accuracy in measuring child body shape.
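The key idea in the abstract is that, because the SBSM is built with PC analysis, the fit can be posed as an optimisation over a small number of PC scores, so every candidate solution is a complete, plausible body shape even when the scan is noisy and incomplete. The Python sketch below illustrates that idea only; the array layout, the nearest-vertex data term, and the quadratic shape prior are assumptions for illustration, not the paper's published formulation.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative sketch of fitting a PCA-based body shape model to a
    # noisy depth-camera point cloud by optimising in PC-score space.
    # Assumed (hypothetical) model data:
    #   mean_shape: (V, 3) mean mesh vertices
    #   components: (K, V*3) principal-component basis

    def reconstruct(scores, mean_shape, components):
        """Rebuild a full-body mesh from a small vector of PC scores."""
        return mean_shape + (scores @ components).reshape(mean_shape.shape)

    def fit_scores(scan_points, mean_shape, components, k=20):
        """Find the k PC scores whose reconstruction best matches the scan.

        Uses a simple nearest-vertex distance as the data term; the paper's
        actual objective and regularisation may differ.
        """
        def objective(scores):
            verts = reconstruct(scores, mean_shape, components[:k])
            # Distance from each scan point to its closest model vertex.
            d = np.linalg.norm(scan_points[:, None, :] - verts[None, :, :],
                               axis=2)
            data_term = np.mean(d.min(axis=1) ** 2)
            # Prior keeps scores near the population mean, so the fit stays
            # inside the realistic body shape space.
            prior = 1e-3 * np.sum(scores ** 2)
            return data_term + prior

        result = minimize(objective, np.zeros(k), method="L-BFGS-B")
        return result.x

Optimising a handful of scores instead of thousands of vertex coordinates is what makes such a fit both fast and robust to holes in the scan: missing regions are filled in by the statistical model rather than left incomplete.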
ISSN: 0014-0139, 1366-5847
DOI: 10.1080/00140139.2014.965754