A Comparative Analysis on the Effectiveness of GAN Performance
| Published in | 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), pp. 1-8 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 06.07.2023 |
| Summary | Generative Adversarial Networks (GANs) have emerged as a potent framework in Artificial Intelligence (AI) for generating realistic synthetic data. With growing interest and rapid advances in GANs, a detailed comparative study is needed to understand the strengths and weaknesses of different GAN variants. This paper presents a comprehensive study and comparison of several types of GANs and their performance in generating high-quality images. GANs have gained popularity in recent years owing to their capacity to generate realistic synthetic images; however, their effectiveness varies with the architecture and parameters employed. We evaluate and compare the performance of different GAN types, including DCGAN, SRGAN, and CGAN, on benchmark datasets such as CIFAR-10 and MNIST. The evaluation metrics include image quality, standard GAN loss functions, and Fréchet inception distance (FID). The results demonstrate that GAN performance is highly dependent on the dataset and architecture used, with no single GAN type dominating across all datasets. This comparative study serves as a valuable resource for researchers and practitioners in AI, providing a foundation for selecting an appropriate GAN architecture for a given generative modeling task. |
|---|---|
| ISSN | 2473-7674 |
| DOI | 10.1109/ICCCNT56998.2023.10307295 |
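The summary names Fréchet inception distance (FID) as one of the evaluation metrics. As a minimal sketch of what that metric computes, the snippet below implements the closed-form FID between two Gaussian approximations of feature sets, FID = ||mu_a - mu_b||^2 + Tr(C_a + C_b - 2(C_a C_b)^(1/2)). In practice the features come from an Inception network's activations; here random Gaussian features stand in for real and generated activations, and the function name `fid` is illustrative, not from the paper.

```python
# Hedged sketch of the Fréchet inception distance (FID) on precomputed
# feature vectors. Assumes features have already been extracted (the
# paper uses an Inception network for this step); random features below
# are stand-ins for real vs. generated activations.
import numpy as np
from scipy import linalg

def fid(feats_a, feats_b):
    """FID = ||mu_a - mu_b||^2 + Tr(Ca + Cb - 2*sqrtm(Ca @ Cb))."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean = linalg.sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):
        covmean = covmean.real  # drop tiny imaginary parts from numerical noise
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(512, 8))   # stand-in "real" features
fake = rng.normal(0.5, 1.2, size=(512, 8))   # stand-in "generated" features

print(fid(real, real))  # near zero: identical distributions
print(fid(real, fake))  # larger: distributions differ
```

A lower FID indicates that the generated-feature distribution is closer to the real one, which is why the paper can use it to rank GAN variants across datasets.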