Distribution Fitting for Combating Mode Collapse in Generative Adversarial Networks
Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. PP, pp. 1-12 |
---|---|
Main Authors | , , , , |
Format | Journal Article |
Language | English |
Published | United States: IEEE, 20.09.2023 |
Subjects | |
Summary: Mode collapse is a significant unsolved issue of generative adversarial networks (GANs). In this work, we examine the causes of mode collapse from a novel perspective. Due to nonuniform sampling in the training process, some subdistributions may be missed when sampling data. As a result, even when the generated distribution differs from the real one, the GAN objective can still reach its minimum. To address this issue, we propose a global distribution fitting (GDF) method with a penalty term to confine the generated data distribution. When the generated distribution differs from the real one, GDF makes the objective harder to minimize, while the original global minimum remains unchanged. To handle the case where the overall real data is unreachable, we also propose a local distribution fitting (LDF) method. Experiments on several benchmarks demonstrate the effectiveness and competitive performance of GDF and LDF.
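The record only sketches the idea behind GDF: a penalty term added to the GAN objective that confines the generated distribution so the objective is harder to minimize when generated and real distributions differ. The snippet below is a hedged illustration of that general idea, not the paper's formulation: a standard non-saturating generator loss augmented with a batch-level distribution-matching penalty. The RBF-kernel MMD surrogate, the weight `lambda_fit`, and all function names here are assumptions made for illustration.

```python
# Illustrative sketch only: a GAN generator loss with an added
# distribution-fitting penalty. The MMD penalty and its weight are
# assumptions; the paper's GDF/LDF terms may differ.
import torch
import torch.nn.functional as F


def rbf_mmd2(x, y, sigma=1.0):
    """Batch estimate of squared MMD with an RBF kernel."""
    def kernel(a, b):
        d2 = torch.cdist(a, b) ** 2
        return torch.exp(-d2 / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()


def generator_loss(discriminator, generator, real_batch, z, lambda_fit=0.1):
    """Non-saturating generator loss plus a distribution-fitting penalty.

    The penalty compares the generated batch to the real batch as whole
    distributions, so missing modes in the generated data raise the loss
    even when individual samples fool the discriminator.
    """
    fake_batch = generator(z)
    logits = discriminator(fake_batch)
    adv = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
    penalty = rbf_mmd2(fake_batch.flatten(1), real_batch.flatten(1))
    return adv + lambda_fit * penalty
```

In this sketch the penalty leaves the original global minimum intact (it is zero when the two batch distributions coincide) while penalizing any mismatch, which mirrors the property the summary attributes to GDF.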
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2023.3313600