SCAM! Transferring Humans Between Images with Semantic Cross Attention Modulation
Published in | Computer Vision - ECCV 2022, Vol. 13674, pp. 713-729
---|---
Main Authors | , ,
Format | Book Chapter; Conference Proceeding
Language | English
Published | Springer Nature Switzerland, 2022
Series | Lecture Notes in Computer Science
Summary | A large body of recent work targets semantically conditioned image generation. Most such methods focus on the narrower task of pose transfer and ignore the more challenging task of subject transfer, which consists in transferring not only the pose but also the appearance and background. In this work, we introduce SCAM (Semantic Cross Attention Modulation), a system that encodes rich and diverse information in each semantic region of the image (including foreground and background), thus achieving precise generation with emphasis on fine details. This is enabled by the Semantic Attention Transformer Encoder, which extracts multiple latent vectors for each semantic region, and the corresponding generator, which exploits these multiple latents via semantic cross attention modulation. It is trained only with a reconstruction setup, while subject transfer is performed at test time. Our analysis shows that the proposed architecture successfully encodes the diversity of appearance in each semantic region. Extensive experiments on the iDesigner, CelebAMask-HQ, and ADE20K datasets show that SCAM outperforms competing approaches and sets the new state of the art on subject transfer.
---|---
Bibliography | Supplementary Information: The online version contains supplementary material available at https://doi.org/10.1007/978-3-031-19781-9_41.
ISBN | 9783031197802; 3031197801
ISSN | 0302-9743; 1611-3349
DOI | 10.1007/978-3-031-19781-9_41
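The summary describes a generator in which each pixel attends only to the latent vectors of its own semantic region, and the attended result modulates the feature map. The sketch below illustrates that idea with plain numpy; all names, shapes, and the particular scale/shift modulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def semantic_cross_attention(feats, latents, latent_region, pixel_region):
    """Masked cross attention from pixel features to per-region latents.

    feats:         (P, d) pixel features, used as queries
    latents:       (L, d) latent vectors, several per semantic region
    latent_region: (L,)   region id of each latent
    pixel_region:  (P,)   region id of each pixel (the semantic mask)
    """
    d = feats.shape[1]
    scores = feats @ latents.T / np.sqrt(d)            # (P, L)
    # A pixel may only attend to latents of its own semantic region.
    allowed = pixel_region[:, None] == latent_region[None, :]
    scores = np.where(allowed, scores, -1e9)
    attn = softmax(scores, axis=1)
    ctx = attn @ latents                               # (P, d) attended context
    # Modulation: derive a per-pixel scale and shift from the context
    # (this particular gamma/beta split is an assumption for illustration).
    gamma, beta = np.tanh(ctx), 0.1 * ctx
    return (1 + gamma) * feats + beta

# Toy example: 6 pixels, 2 regions, 2 latents per region, d = 4.
rng = np.random.default_rng(0)
feats = rng.normal(size=(6, 4))
latents = rng.normal(size=(4, 4))
latent_region = np.array([0, 0, 1, 1])
pixel_region = np.array([0, 0, 0, 1, 1, 1])
out = semantic_cross_attention(feats, latents, latent_region, pixel_region)
print(out.shape)  # (6, 4)
```

Because several latents exist per region, the attention weights let each pixel mix them differently, which is one way the diversity of appearance within a region could be represented.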