Extraction of Feature Points for Non-Uniform Rational B-Splines (NURBS)-Based Modeling of Human Legs
| Field | Value |
|---|---|
| Published in | 东华大学学报(英文版) (Journal of Donghua University, English Edition), Vol. 39, No. 4, pp. 299-303 |
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | 30.08.2022 |
| Affiliations | College of Information Science and Technology, Donghua University, Shanghai 201620, China; Engineering Research Center of Digitized Textile & Fashion Technology, Ministry of Education, Shanghai 201620, China; College of Textiles, Donghua University, Shanghai 201620, China |
Summary (CLC number TS107.5): Methods of digital human modeling have been developed and utilized to reflect human shape features. However, most published works have focused on dynamic visualization or fashion design rather than the high-accuracy modeling demanded by medical and rehabilitation scenarios. As a prerequisite for high-accuracy modeling of human legs based on non-uniform rational B-splines (NURBS), this work presents a method for extracting the required quasi-grid network of feature points of human legs. Given a 3D scanned human body, the leg is first segmented and placed in a standardized position. The leg is then re-sampled via a set of equidistant cross sections. Through analysis of the leg circumferences and circumferential curvature, the characteristic sections of the leg, as well as the characteristic points on those sections, are identified according to human anatomy and shape features. The obtained collection can be arranged to form a grid of data points for knot calculation and high-accuracy shape reconstruction in future work.
ISSN: 1672-5220
DOI: 10.19884/j.1672-5220.202202339
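
The abstract outlines a geometric pipeline: slice the standardized leg into equidistant cross sections, analyze section circumferences and circumferential curvature, and keep the characteristic sections and points. The record does not reproduce the paper's actual algorithms, so the following Python sketch only illustrates one plausible reading of those steps on a synthetic point cloud; all function names, parameter values, and the data are assumptions for demonstration, not the authors' implementation.

```python
# A minimal, illustrative sketch (not the authors' code) of the steps named in
# the abstract: re-sampling a standardized leg point cloud with equidistant
# cross sections, estimating each section's circumference, and computing a
# discrete circumferential curvature that could flag candidate feature points.
import numpy as np


def slice_leg(points, n_sections=50):
    """Split a leg point cloud (N x 3, leg axis along z) into equidistant
    cross sections; returns one (M_i x 3) array per section."""
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_sections + 1)
    return [points[(z >= lo) & (z < hi)] for lo, hi in zip(edges[:-1], edges[1:])]


def section_contour(section):
    """Order the section's points by polar angle around their centroid,
    giving an approximate closed boundary contour in the x-y plane."""
    xy = section[:, :2]
    center = xy.mean(axis=0)
    angles = np.arctan2(xy[:, 1] - center[1], xy[:, 0] - center[0])
    return xy[np.argsort(angles)]


def circumference(contour):
    """Perimeter of the closed polyline through the ordered contour points."""
    closed = np.vstack([contour, contour[:1]])
    return np.linalg.norm(np.diff(closed, axis=0), axis=1).sum()


def circumferential_curvature(contour):
    """Discrete curvature at each contour point: turning angle between the
    incoming and outgoing edges divided by the mean local edge length."""
    e_in = contour - np.roll(contour, 1, axis=0)
    e_out = np.roll(contour, -1, axis=0) - contour
    cross = e_in[:, 0] * e_out[:, 1] - e_in[:, 1] * e_out[:, 0]
    dot = (e_in * e_out).sum(axis=1)
    turning = np.arctan2(cross, dot)
    ds = 0.5 * (np.linalg.norm(e_in, axis=1) + np.linalg.norm(e_out, axis=1))
    return turning / np.maximum(ds, 1e-9)


if __name__ == "__main__":
    # Synthetic stand-in for a segmented, standardized leg: a noisy tapered cylinder.
    rng = np.random.default_rng(0)
    z = rng.uniform(0.0, 1.0, 20000)
    r = 0.12 - 0.05 * z + 0.002 * rng.standard_normal(z.size)
    theta = rng.uniform(0.0, 2.0 * np.pi, z.size)
    leg = np.column_stack([r * np.cos(theta), r * np.sin(theta), z])

    sections = [s for s in slice_leg(leg, n_sections=40) if len(s) > 10]
    contours = [section_contour(s) for s in sections]
    circs = np.array([circumference(c) for c in contours])

    # Sections where the circumference profile changes fastest are candidate
    # characteristic sections; their highest-curvature contour points are
    # candidate characteristic points.
    candidate_section = int(np.argmax(np.abs(np.gradient(circs))))
    kappa = circumferential_curvature(contours[candidate_section])
    top = np.argsort(-np.abs(kappa))[:5]
    print("candidate section index:", candidate_section)
    print("indices of highest-curvature contour points:", top.tolist())
```

Sorting by polar angle is only a rough contour approximation; a pipeline aimed at NURBS reconstruction would more likely fit a closed curve to each section before measuring circumference and curvature, and would place the retained points into the quasi-grid network described in the abstract.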