gDNA: Towards Generative Detailed Neural Avatars

Bibliographic Details
Published in: Proceedings (IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Online), pp. 20395 - 20405
Main Authors: Chen, Xu; Jiang, Tianjian; Song, Jie; Yang, Jinlong; Black, Michael J.; Geiger, Andreas; Hilliges, Otmar
Format: Conference Proceeding
Language: English
Published: IEEE, 01.06.2022

Summary: To make 3D human avatars widely available, we must be able to generate a variety of 3D virtual humans with varied identities and shapes in arbitrary poses. This task is challenging due to the diversity of clothed body shapes, their complex articulations, and the resulting rich, yet stochastic geometric detail in clothing. Hence, current methods that represent 3D people do not provide a full generative model of people in clothing. In this paper, we propose a novel method that learns to generate detailed 3D shapes of people in a variety of garments with corresponding skinning weights. Specifically, we devise a multi-subject forward skinning module that is learned from only a few posed, unrigged scans per subject. To capture the stochastic nature of high-frequency details in garments, we leverage an adversarial loss formulation that encourages the model to capture the underlying statistics. We provide empirical evidence that this leads to realistic generation of local details such as wrinkles. We show that our model is able to generate natural human avatars wearing diverse and detailed clothing. Furthermore, we show that our method can be used on the task of fitting human models to raw scans, outperforming the previous state-of-the-art.
ISSN: 1063-6919
DOI: 10.1109/CVPR52688.2022.01978
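
The summary mentions a multi-subject forward skinning module that deforms a canonical shape into arbitrary poses using learned skinning weights. Below is a minimal sketch of the underlying forward linear blend skinning (LBS) operation that such a module generalizes; the function name, shapes, and NumPy implementation are illustrative assumptions, not the paper's actual interface.

```python
# Minimal forward linear blend skinning (LBS) sketch.
# Assumption: per-point skinning weights and rigid bone transforms are given;
# in gDNA these weights would be predicted by the learned skinning module.
import numpy as np

def forward_skin(points, weights, bone_transforms):
    """Deform canonical-space points into a target pose.

    points:          (N, 3) points in canonical space
    weights:         (N, B) skinning weights, each row summing to 1
    bone_transforms: (B, 4, 4) rigid transforms of the B bones for the target pose
    """
    # Homogeneous coordinates: (N, 4)
    homo = np.concatenate([points, np.ones((points.shape[0], 1))], axis=1)
    # Blend the bone transforms per point: (N, 4, 4)
    blended = np.einsum("nb,bij->nij", weights, bone_transforms)
    # Apply each point's blended transform and drop the homogeneous coordinate
    posed = np.einsum("nij,nj->ni", blended, homo)[:, :3]
    return posed

# Sanity check: identity bone transforms leave the points unchanged.
pts = np.random.rand(5, 3)
w = np.full((5, 2), 0.5)
T = np.stack([np.eye(4), np.eye(4)])
assert np.allclose(forward_skin(pts, w, T), pts)
```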