ShaTure: Shape and Texture Deformation for Human Pose and Attribute Transfer
Published in: IEEE Transactions on Image Processing, Vol. 31, pp. 2541-2556
Format: Journal Article
Language: English
Published: United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2022
Summary: In this paper, we present a novel end-to-end pose transfer framework that transforms a source person image to an arbitrary pose with controllable attributes. Due to the spatial misalignment caused by occlusions and multiple viewpoints, maintaining high-quality shape and texture appearance remains a challenging problem for pose-guided person image synthesis. Because they do not account for the deformation of shape and texture, existing solutions for controllable pose transfer still cannot generate high-fidelity texture for the target image. To address this problem, we design a new image reconstruction decoder, ShaTure, which formulates shape and texture in a braiding manner: it interchanges discriminative features in both feature-level and pixel-level space so that shape and texture can be mutually fine-tuned. In addition, we develop a new bottleneck module, the Adaptive Style Selector (AdaSS), which enhances multi-scale feature extraction by self-recalibrating the feature map through channel-wise attention. Both quantitative and qualitative results show that the proposed framework outperforms state-of-the-art human pose and attribute transfer methods, and detailed ablation studies confirm the effectiveness and robustness of each contribution.
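The "self-recalibration of the feature map through channel-wise attention" mentioned in the abstract can be illustrated with a minimal, framework-free sketch of the general squeeze-excite-rescale pattern. This is not the paper's AdaSS module (whose architecture is not given in this record); the function name, the toy sigmoid gate standing in for learned layers, and the list-based feature-map representation are all illustrative assumptions.

```python
# Sketch of channel-wise attention: each channel of a feature map is
# rescaled by a gate computed from its own global statistic. In a real
# network the gate would come from small learned layers; here a sigmoid
# of the channel mean serves as a toy stand-in.
import math

def channel_attention(feature_map):
    """feature_map: list of channels, each a 2-D list (H x W) of floats.
    Returns the same-shaped map with every channel multiplied by a
    sigmoid gate derived from its global average (the 'squeeze' step)."""
    recalibrated = []
    for channel in feature_map:
        flat = [v for row in channel for v in row]
        mean = sum(flat) / len(flat)           # squeeze: global average pool
        gate = 1.0 / (1.0 + math.exp(-mean))   # excite: toy sigmoid gating
        recalibrated.append([[v * gate for v in row] for row in channel])
    return recalibrated
```

Applied to a two-channel 2x2 map, a high-mean channel is passed through with a gate near 1 while a low-mean channel is suppressed toward 0.5 of its values, which is the sense in which attention "recalibrates" channels relative to one another.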
ISSN: 1057-7149, 1941-0042
DOI: 10.1109/TIP.2022.3157146