Generative adversarial networks (GANs): Introduction, Taxonomy, Variants, Limitations, and Applications

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 83, No. 41, pp. 88811–88858
Main Authors: Sharma, Preeti; Kumar, Manoj; Sharma, Hitesh Kumar; Biju, Soly Mathew
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.12.2024

Summary: The growing demand for applications based on Generative Adversarial Networks (GANs) has prompted substantial study and analysis across a variety of fields. GAN models have applications in NLP, architectural design, text-to-image and image-to-image translation, 3D object generation, audio-to-image synthesis, and prediction. The technique is an important tool for both generation and detection, notably in identifying falsely created pictures, particularly face forgeries, to ensure visual integrity and security. GANs are thus critical for assessing visual credibility on social media by identifying and evaluating forgeries. As the field progresses, numerous GAN variants have emerged, along with diverse assessment techniques for measuring model efficacy and scope. The article provides a comprehensive overview of the most recent advances in GAN model designs, the efficacy and breadth of GAN variants, GAN limitations and potential solutions, and the growing ecosystem of emerging GAN application domains. Additionally, it examines key metrics such as the Inception Score (IS) and Fréchet Inception Distance (FID) as critical benchmarks for comparing GAN performance against existing approaches.
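As background to the FID metric mentioned in the summary: FID compares real and generated images by fitting Gaussians to their Inception-network feature activations and computing the Fréchet distance ||μ₁ − μ₂||² + Tr(Σ₁ + Σ₂ − 2(Σ₁Σ₂)^½). The sketch below (not from the article) illustrates that formula on synthetic feature vectors; the feature dimensions and sample data are invented stand-ins for real Inception activations.

```python
import numpy as np

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """Fréchet distance between Gaussians N(mu1, sigma1) and N(mu2, sigma2):
    ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 (sigma1 @ sigma2)^(1/2))."""
    diff = mu1 - mu2
    # Eigenvalues of sigma1 @ sigma2 are real and non-negative (the product of
    # two SPD matrices is similar to an SPD matrix), so the trace of its matrix
    # square root is the sum of the eigenvalues' square roots; clip tiny
    # negative values caused by floating-point round-off.
    eigvals = np.linalg.eigvals(sigma1 @ sigma2).real
    tr_sqrt = np.sqrt(np.clip(eigvals, 0.0, None)).sum()
    return float(diff @ diff + np.trace(sigma1) + np.trace(sigma2) - 2.0 * tr_sqrt)

# Toy illustration: synthetic feature vectors stand in for Inception
# activations of real and generated images (dimensions are invented).
rng = np.random.default_rng(0)
real_feats = rng.normal(0.0, 1.0, size=(2000, 8))
fake_feats = rng.normal(0.5, 1.0, size=(2000, 8))  # mean-shifted "generator"

mu_r, sigma_r = real_feats.mean(axis=0), np.cov(real_feats, rowvar=False)
mu_f, sigma_f = fake_feats.mean(axis=0), np.cov(fake_feats, rowvar=False)

fid = frechet_distance(mu_r, sigma_r, mu_f, sigma_f)
print(f"FID = {fid:.3f}")  # lower means the two feature distributions are closer
```

In practice FID is computed over activations from a pretrained Inception-v3 network rather than raw features; this sketch only demonstrates the distance formula itself.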
ISSN: 1380-7501 (print); 1573-7721 (electronic)
DOI: 10.1007/s11042-024-18767-y