AdversarialNAS: Adversarial Neural Architecture Search for GANs

Bibliographic Details
Published in: arXiv.org
Main Authors: Chen, Gao; Chen, Yunpeng; Liu, Si; Tan, Zhenxiong; Yan, Shuicheng
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 08.04.2020

Summary: Neural Architecture Search (NAS), which aims to automate the procedure of architecture design, has achieved promising results in many computer vision fields. In this paper, we propose an AdversarialNAS method specially tailored for Generative Adversarial Networks (GANs) to search for a superior generative model on the task of unconditional image generation. AdversarialNAS is the first method that can search the architectures of the generator and discriminator simultaneously in a differentiable manner. During searching, the designed adversarial search algorithm does not need to compute any extra metric to evaluate the performance of the searched architecture, and the search paradigm considers the relevance between the two network architectures and improves their mutual balance. Therefore, AdversarialNAS is very efficient and only takes 1 GPU day to search for a superior generative model in the proposed large search space (\(10^{38}\)). Experiments demonstrate the effectiveness and superiority of our method. The discovered generative model sets a new state-of-the-art FID score of \(10.87\) and a highly competitive Inception Score of \(8.74\) on CIFAR-10. Its transferability is also proven by setting a new state-of-the-art FID score of \(26.98\) and Inception Score of \(9.63\) on STL-10. Code is available at \url{https://github.com/chengaopro/AdversarialNAS}.
ISSN: 2331-8422
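
The summary describes a differentiable, DARTS-style relaxation in which the generator and discriminator architectures are optimized jointly through the ordinary adversarial losses, with no extra evaluation metric during search. Below is a minimal PyTorch sketch of that idea; the MixedOp, TinyGenerator, TinyDiscriminator, and adversarial_search_step names, the toy candidate operation set, and the single optimizer per network (updating weights and architecture parameters together) are illustrative assumptions, not the authors' implementation (see the linked repository for that).

import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted sum over candidate ops (DARTS-style continuous relaxation)."""
    def __init__(self, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        # Learnable architecture parameters (one logit per candidate op).
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(candidate_ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

def make_cell(channels):
    # Toy candidate set; the paper's search space is far larger.
    return MixedOp([
        nn.Conv2d(channels, channels, 3, padding=1),
        nn.Conv2d(channels, channels, 5, padding=2),
        nn.Identity(),
    ])

class TinyGenerator(nn.Module):
    def __init__(self, z_dim=64, channels=32):
        super().__init__()
        self.fc = nn.Linear(z_dim, channels * 8 * 8)
        self.cell = make_cell(channels)
        self.to_img = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, z):
        h = self.fc(z).view(z.size(0), -1, 8, 8)
        return torch.tanh(self.to_img(F.relu(self.cell(h))))

class TinyDiscriminator(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.cell = make_cell(channels)
        self.head = nn.Linear(channels, 1)

    def forward(self, x):
        h = F.relu(self.cell(self.stem(x)))
        return self.head(h.mean(dim=[2, 3]))

def adversarial_search_step(G, D, real, opt_g, opt_d, z_dim=64):
    """One alternating update: weights and architecture parameters on both
    sides are trained with the standard GAN losses alone, so no extra
    performance metric is computed during search."""
    bce = F.binary_cross_entropy_with_logits

    # Discriminator update (its weights and its alphas).
    z = torch.randn(real.size(0), z_dim)
    fake = G(z).detach()
    d_loss = bce(D(real), torch.ones(real.size(0), 1)) \
           + bce(D(fake), torch.zeros(real.size(0), 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator update (its weights and its alphas).
    z = torch.randn(real.size(0), z_dim)
    g_loss = bce(D(G(z)), torch.ones(real.size(0), 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

# Usage sketch with a stand-in batch of "real" 8x8 images.
G, D = TinyGenerator(), TinyDiscriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
real = torch.rand(8, 3, 8, 8) * 2 - 1
adversarial_search_step(G, D, real, opt_g, opt_d)

After search, each MixedOp would typically be discretized by keeping the candidate operation with the largest alpha; a bilevel scheme that separates weight and architecture optimizers is the more common NAS design choice and is omitted here for brevity.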