Realistic Depth Image Synthesis for 3D Hand Pose Estimation
Published in | IEEE Transactions on Multimedia, Vol. 26, pp. 1-12 |
---|---|
Format | Journal Article |
Language | English |
Published | Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2024 |
Summary: | The training of depth image-based hand pose estimation models typically relies on real-life datasets that are expected to 1) be large-scale and cover a diverse range of hand poses and hand shapes, and 2) come with high-precision annotations. However, existing datasets are rather limited in these regards due to a multitude of practical constraints, with time and cost being the major concerns. This observation motivates us to propose an alternative approach, in which the hand pose model is primarily trained with synthesized hand depth images that closely mimic the characteristic noise patterns of the specific depth camera make under consideration. This is achieved by first mapping a Gaussian-distributed variable to a specific non-i.i.d. (independent and identically distributed) depth noise pattern, and then transforming a "vanilla" noise-free synthetic depth image into a realistic-looking image. Extensive empirical experiments demonstrate that our approach is capable of generating camera-specific, realistic-looking hand depth images with precise annotations; compared to relying entirely on annotated real images, a hand pose model with better performance is obtained by using only a small fraction (10%) of annotated real images together with our synthesized images. |
---|---|
ISSN: | 1520-9210, 1941-0077 |
DOI: | 10.1109/TMM.2023.3330522 |
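
The summary above describes a two-stage idea: map i.i.d. Gaussian noise to a structured (non-i.i.d.) depth-noise pattern conditioned on the scene, then add it to a noise-free synthetic depth image to obtain a realistic-looking one. Below is a minimal PyTorch sketch of that idea; the network design, its size, and all names are assumptions made here for illustration and do not reproduce the paper's actual architecture or training objective.

```python
# Illustrative sketch only: a small generator maps i.i.d. Gaussian noise, conditioned on a
# clean synthetic depth map, to a spatially structured (non-i.i.d.) residual noise map,
# which is then added to the clean depth to yield a camera-like depth image.
# Hypothetical module and variable names; the paper's method (e.g., adversarial training
# against real captures of a specific camera make) may differ substantially.

import torch
import torch.nn as nn


class DepthNoiseGenerator(nn.Module):
    """Maps (clean depth, Gaussian noise) -> structured residual noise map."""

    def __init__(self, noise_channels: int = 8, hidden: int = 32):
        super().__init__()
        self.noise_channels = noise_channels
        self.net = nn.Sequential(
            nn.Conv2d(1 + noise_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, 1, kernel_size=3, padding=1),  # per-pixel noise residual
        )

    def forward(self, clean_depth: torch.Tensor) -> torch.Tensor:
        # clean_depth: (B, 1, H, W) noise-free synthetic depth map (units assumed).
        b, _, h, w = clean_depth.shape
        z = torch.randn(b, self.noise_channels, h, w, device=clean_depth.device)
        residual = self.net(torch.cat([clean_depth, z], dim=1))
        # Conditioning on depth lets the residual depend on local geometry, so the
        # output noise is correlated across pixels rather than i.i.d. per pixel.
        return clean_depth + residual


if __name__ == "__main__":
    gen = DepthNoiseGenerator()
    synthetic = torch.rand(2, 1, 128, 128)   # placeholder clean depth maps
    realistic = gen(synthetic)               # noisy, camera-like depth maps
    print(realistic.shape)                   # torch.Size([2, 1, 128, 128])
```

In such a setup, the synthesized realistic-looking images inherit the exact joint annotations of their noise-free sources, which is what allows the abstract's claim of precise labels at negligible annotation cost.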