3D Aided Duet GANs for Multi-View Face Image Synthesis
Published in: IEEE Transactions on Information Forensics and Security, Vol. 14, No. 8, pp. 2028–2042
Main Authors:
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2019
Summary: Multi-view face synthesis from a single image is an ill-posed computer vision problem: without proper constraints, the synthesized views suffer from appearance distortions, and producing photo-realistic, identity-preserving multi-view results remains difficult. This paper proposes 3D aided duet generative adversarial networks (AD-GAN) to rotate the yaw angle of an input face image precisely to any specified angle. AD-GAN decomposes the challenging synthesis problem into two well-constrained subtasks, handled by a face normalizer and a face editor. The normalizer first frontalizes the input image; the editor then rotates the frontalized image to a desired pose guided by a remote code. The face normalizer is also designed to estimate a novel dense UV correspondence field, making the model aware of 3D face geometry. To generate photo-realistic local details and accelerate convergence, the normalizer and the editor are trained in a two-stage manner and regularized by a conditional self-cycle loss and a perceptual loss. Exhaustive experiments in both controlled and uncontrolled environments demonstrate that the proposed method not only improves the visual realism of multi-view synthetic images but also preserves identity information well.
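The two-stage pipeline described in the summary (normalizer frontalizes, editor rotates under a target-pose "remote code") can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class names, the one-hot pose encoding, and the toy image handling are all assumptions introduced here for clarity.

```python
import numpy as np

def one_hot_pose(yaw_bin, n_bins=9):
    """Remote code: a one-hot encoding of the target yaw-angle bin
    (the encoding scheme is an assumption for this sketch)."""
    code = np.zeros(n_bins)
    code[yaw_bin] = 1.0
    return code

class FaceNormalizer:
    """Stage 1: frontalize the input face (stand-in for the GAN generator).
    A real normalizer would also estimate a dense UV correspondence field
    to inject 3D face geometry; here we only tag the pose."""
    def __call__(self, image):
        return {"pixels": image, "pose": "frontal"}

class FaceEditor:
    """Stage 2: rotate the frontalized face to the pose named by the code."""
    def __call__(self, frontal, pose_code):
        target_bin = int(np.argmax(pose_code))
        return {"pixels": frontal["pixels"], "pose": f"yaw_bin_{target_bin}"}

def synthesize(image, yaw_bin):
    """Compose the two well-constrained subtasks: any pose -> frontal -> target yaw."""
    normalizer, editor = FaceNormalizer(), FaceEditor()
    frontal = normalizer(image)
    return editor(frontal, one_hot_pose(yaw_bin))

out = synthesize(np.zeros((128, 128, 3)), yaw_bin=4)
print(out["pose"])  # -> yaw_bin_4
```

The decomposition is the point of the sketch: each stage solves a well-constrained mapping (to a canonical frontal pose, then from frontal to an arbitrary yaw), which is what makes the overall ill-posed synthesis tractable per the summary.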
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2019.2891116