Inverting the Generator of a Generative Adversarial Network
Published in | IEEE Transactions on Neural Networks and Learning Systems Vol. 30; no. 7; pp. 1967–1974 |
Main Authors | Creswell, Antonia; Bharath, Anil Anthony |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2019 |
ISSN | 2162-237X 2162-2388 |
DOI | 10.1109/TNNLS.2018.2875194 |
Abstract | Generative adversarial networks (GANs) learn a deep generative model that is able to synthesize novel, high-dimensional data samples. New data samples are synthesized by passing latent samples, drawn from a chosen prior distribution, through the generative model. Once trained, the latent space exhibits interesting properties that may be useful for downstream tasks such as classification or retrieval. Unfortunately, GANs do not offer an "inverse model," a mapping from data space back to latent space, making it difficult to infer a latent representation for a given data sample. In this paper, we introduce a technique, inversion, to project data samples, specifically images, to the latent space using a pretrained GAN. Using our proposed inversion technique, we are able to identify which attributes of a data set a trained GAN is able to model, and to quantify GAN performance based on a reconstruction loss. We demonstrate how our proposed inversion technique may be used to quantitatively compare the performance of various GAN models trained on three image data sets. We provide code for all of our experiments at https://github.com/ToniCreswell/InvertingGAN. |
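The inversion technique the abstract describes amounts to gradient-based optimization of a latent vector against a frozen generator. Below is a minimal sketch in PyTorch, assuming a pretrained generator `G` that maps latent vectors to images; the function name, hyperparameters, and the mean-squared reconstruction loss are illustrative stand-ins, not the authors' exact procedure (their code is at the repository linked above).

```python
import torch
import torch.nn.functional as F

def invert_generator(G, x_target, z_dim=100, steps=1000, lr=0.01):
    """Find z* such that G(z*) approximates x_target, by gradient
    descent on a pixel-wise reconstruction loss; G stays fixed."""
    for p in G.parameters():               # freeze the generator
        p.requires_grad_(False)
    # Initialize z from the prior the GAN was trained with (here, N(0, I)).
    z = torch.randn(x_target.size(0), z_dim, requires_grad=True)
    opt = torch.optim.RMSprop([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.mse_loss(G(z), x_target)  # reconstruction loss on pixels
        loss.backward()                    # gradients flow to z only
        opt.step()
    return z.detach(), loss.item()
```

The final reconstruction loss returned here is the kind of per-sample measure the abstract proposes for quantifying and comparing trained GANs.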
Author | Creswell, Antonia; Bharath, Anil Anthony |
Author_xml | – sequence: 1; givenname: Antonia; surname: Creswell; fullname: Creswell, Antonia; orcidid: 0000-0003-1037-9395; email: ac2211@ic.ac.uk; organization: BICV, Imperial College London, London, U.K. – sequence: 2; givenname: Anil Anthony; surname: Bharath; fullname: Bharath, Anil Anthony; orcidid: 0000-0001-8808-2714; email: aab01@ic.ac.uk; organization: BICV, Imperial College London, London, U.K. |
CODEN | ITNNAL |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019 |
DOI | 10.1109/TNNLS.2018.2875194 |
Discipline | Computer Science |
EISSN | 2162-2388 |
EndPage | 1974 |
ExternalDocumentID | 30403640 10_1109_TNNLS_2018_2875194 8520899 |
Genre | orig-research Journal Article |
GrantInformation_xml | – fundername: Engineering and Physical Sciences Research Council grantid: EP/L504786/1 funderid: 10.13039/501100000266 |
ISSN | 2162-237X 2162-2388 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true
IsScholarly | true |
Issue | 7 |
Language | English |
License | https://creativecommons.org/licenses/by/3.0/legalcode |
ORCID | 0000-0001-8808-2714 0000-0003-1037-9395 |
OpenAccessLink | https://ieeexplore.ieee.org/document/8520899
PMID | 30403640 |
PQID | 2244350642 |
PQPubID | 85436 |
PageCount | 8 |
PublicationDate | 2019-07-01 |
PublicationPlace | United States |
PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev | TNNLS |
PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
PublicationYear | 2019 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 1967 |
SubjectTerms | Backpropagation; Data models; Datasets; Energy management; Feature extraction; Gallium nitride; Generative adversarial networks; Generators; Image generation; Image reconstruction; Inversion; Learning systems; Mapping; Multilayer neural network; Pattern recognition; Synthesis; Training; Unsupervised learning; Websites |
Title | Inverting the Generator of a Generative Adversarial Network |
URI | https://ieeexplore.ieee.org/document/8520899 https://www.ncbi.nlm.nih.gov/pubmed/30403640 https://www.proquest.com/docview/2244350642 https://www.proquest.com/docview/2131243569 |
Volume | 30 |
link | http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV07T8MwED6VTiwUKI9CQUZig7SO4ziJmBCiqhDtQit1ixI_GEANou3Cr-fsPCQQIDYncRzHPtvf5zvfAVyGmR8brqVHMeGhUFAcczL2qMbbCqUg5_aA82QqxnP-sAgXLbhuzsJorZ3xmR7YpNPlq0Ju7FbZMA6Z1VJtwRYSt_KsVrOfQhGXC4d2mS-Yx4JoUZ-RoclwNp0-PllDrniAFAFRi43Hg0zeauHolyXJxVj5HW66ZWfUgUld4dLa5GWwWecD-fHNl-N__2gXdir8SW5LgdmDll7uQ6eO7UCqod6FG-uAw3oYeCYIEUnpnRr5OSkMyepLnCmJi-i8yqwck2lpU34A89H97G7sVYEWPMmFv_aUUNogM0OyRwWTSvOYJz4zQaKoYSbPqEhUjlCN0zxPhMnCKMoiZWSUxEwyGRxCe1ks9TEQRA9cZZLhPBpy5uuYKqpyibzFqFBFfg_8uq1TWXkht8EwXlPHRmiSuq5KbVelVVf14Kp55630wfFn7q5t5yZn1cQ96NddmlbDdJUifkG4aDlYDy6axzjArNYkW-pig3n8ADFQEAos4qgUhabsWoJOfv7mKWxjzZLSurcP7fX7Rp8hhlnn5054PwEmyOil |
linkProvider | IEEE |
linkToHtml | http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1Lj9MwEB6V7gEu210KbKEsXokbpGs7jpNoTwhRdaHNZVuptyjxgwMoWW2bC7-esfOQQIC4OYnjOJ4Z-xvPeAbgbVSwxAqjAoqFAJmCosypJKAGb2vkglK4A86bTK524vM-2o_g_XAWxhjjnc_MwhW9LV_XqnFbZddJxJ2V6hGc4Lofsfa01rCjQhGZS493OZM84GG870_J0PR6m2XrO-fKlSxQSUDc4jLyoC7v7HD0l0XJZ1n5O-D0C89yApu-y62_ybdFcywX6sdv0Rz_95_O4LRDoORDyzLnMDLVU5j02R1IJ-xTuHEhOFyMga8EQSJp41Ojhk5qS4r-EudK4nM6HwrHySRrvcqfwW75aftxFXSpFgIlJDsGWmpjUTdDdY9KrrQRiUgZt2GqqeW2LKhMdYlgTdCyTKUtojguYm1VnCZccRU-h3FVV-YCCOIHoQvFcSaNBGcmoZrqUqHmYnWkYzYD1o91rro45C4dxvfc6yM0zT2pckeqvCPVDN4N79y3UTj-WXvqxnmo2Q3xDOY9SfNOUA85IhgEjE4Lm8HV8BhFzNlNisrUDdZhIaKgMJLYxIuWFYa2ew56-edvvoHHq-1mna9vsy-v4An2Mm19fecwPj405jUimmN56Rn5J3QX6-4 |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Inverting+the+Generator+of+a+Generative+Adversarial+Network&rft.jtitle=IEEE+transaction+on+neural+networks+and+learning+systems&rft.au=Creswell%2C+Antonia&rft.au=Bharath%2C+Anil+Anthony&rft.date=2019-07-01&rft.issn=2162-237X&rft.eissn=2162-2388&rft.volume=30&rft.issue=7&rft.spage=1967&rft.epage=1974&rft_id=info:doi/10.1109%2FTNNLS.2018.2875194&rft.externalDBID=n%2Fa&rft.externalDocID=10_1109_TNNLS_2018_2875194 |