FuseGAN: Learning to Fuse Multi-Focus Image via Conditional Generative Adversarial Network

Bibliographic Details
Published in: IEEE Transactions on Multimedia, Vol. 21, No. 8, pp. 1982-1996
Main Authors: Guo, Xiaopeng; Nie, Rencan; Cao, Jinde; Zhou, Dongming; Mei, Liye; He, Kangjian
Format: Journal Article
Language: English
Published: Piscataway: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.08.2019

Abstract We study the problem of multi-focus image fusion, where the key challenge is accurately detecting the focused regions among multiple partially focused source images. Inspired by the success of the conditional generative adversarial network (cGAN) on image-to-image translation tasks, we propose a novel FuseGAN to fulfill the images-to-image mapping required by multi-focus image fusion. To satisfy the dual-input-to-one-output requirement, the encoder of the generator in FuseGAN is designed as a Siamese network. The least-squares GAN objective is employed to enhance the training stability of FuseGAN, resulting in an accurate confidence map for focus region detection. We also apply the convolutional conditional random fields technique to the confidence map to obtain a refined final decision map for better focus region detection. Moreover, because no large-scale standard dataset exists, we synthesize a sufficiently large multi-focus image dataset from the public natural image dataset PASCAL VOC 2012, using a normalized disk point spread function to simulate defocus and separating the background and foreground of each image during synthesis. We conduct extensive experiments on two public datasets to verify the effectiveness of the proposed method. The results demonstrate that the proposed method produces accurate decision maps for focus regions in multi-focus images, and that the fused images are superior to those of 11 recent state-of-the-art algorithms, not only in visual perception but also in quantitative analysis in terms of five metrics.
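
The abstract names two mechanisms that are easy to make concrete: a weight-shared (Siamese) encoder that lets one generator consume two source images, and the least-squares GAN objective used for training stability. The sketch below is a minimal PyTorch illustration of both ideas; the layer sizes, channel counts, and class names are assumptions for illustration, not the authors' published architecture.

```python
import torch
import torch.nn as nn

class SiameseGenerator(nn.Module):
    # Illustrative stand-in for the FuseGAN generator, not the paper's exact network.
    def __init__(self):
        super().__init__()
        # A single encoder applied to both source images (shared weights = Siamese).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        )
        # The decoder fuses the concatenated features into one confidence map.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(256, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, src_a, src_b):
        # Dual input -> one output: encode each source, concatenate, decode.
        feats = torch.cat([self.encoder(src_a), self.encoder(src_b)], dim=1)
        return self.decoder(feats)  # confidence map in [0, 1]

def lsgan_d_loss(d_real, d_fake):
    # Least-squares GAN: push real scores toward 1 and fake scores toward 0.
    return 0.5 * ((d_real - 1).pow(2).mean() + d_fake.pow(2).mean())

def lsgan_g_loss(d_fake):
    # The generator pushes the discriminator's score on fakes toward 1.
    return 0.5 * (d_fake - 1).pow(2).mean()
```

At inference time, the confidence map is binarized (and, per the abstract, refined with convolutional CRFs) into a decision map D, after which the fusion itself is the pixelwise blend D * src_a + (1 - D) * src_b.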
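The dataset-synthesis step described in the abstract is likewise mechanical: blur one region of a natural image with a normalized disk point spread function while keeping the other region sharp, yielding two complementary partially focused sources whose ground-truth decision map is the region mask. Below is a sketch under the assumption of a grayscale image and a precomputed binary foreground mask (standing in for the PASCAL VOC 2012 segmentation); function names are hypothetical.

```python
import numpy as np
from scipy.ndimage import convolve

def disk_psf(radius):
    # Normalized disk point spread function: uniform inside the radius,
    # zero outside, summing to 1 so overall brightness is preserved.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    kernel = (x**2 + y**2 <= radius**2).astype(np.float64)
    return kernel / kernel.sum()

def synthesize_pair(image, fg_mask, radius=5):
    # image: HxW float array; fg_mask: HxW boolean foreground mask.
    blurred = convolve(image, disk_psf(radius), mode='reflect')
    src_a = np.where(fg_mask, image, blurred)  # foreground in focus
    src_b = np.where(fg_mask, blurred, image)  # background in focus
    return src_a, src_b, fg_mask  # the mask doubles as the ground-truth decision map
```
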
Authors
– Guo, Xiaopeng (ORCID: 0000-0003-1111-2035; xiaopengguo@mail.ynu.edu.cn), School of Information Science and Engineering, Yunnan University, Kunming, China
– Nie, Rencan (ORCID: 0000-0003-0568-1231; rcnie@ynu.edu.cn), School of Information Science and Engineering, Yunnan University, Kunming, China
– Cao, Jinde (ORCID: 0000-0003-3133-7119; jdcao@seu.edu.cn), School of Mathematics, Southeast University, Nanjing, China
– Zhou, Dongming (ORCID: 0000-0003-0139-9415; zhoudm@ynu.edu.cn), School of Information Science and Engineering, Yunnan University, Kunming, China
– Mei, Liye (liyemei@mail.ynu.edu.cn), School of Information Science and Engineering, Yunnan University, Kunming, China
– He, Kangjian (Hekangjian92@126.com), School of Information Science and Engineering, Yunnan University, Kunming, China
CODEN: ITMUF8
Copyright: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019
DOI: 10.1109/TMM.2019.2895292
Discipline: Engineering; Computer Science
EISSN: 1941-0077
Genre: Original research
Funding
– National Natural Science Foundation of China, grants 61463052 and 61365001 (funder ID: 10.13039/501100001809)
– China Postdoctoral Science Foundation, grant 171740 (funder ID: 10.13039/501100002858)
– Yunnan University, grant YDY17111 (funder ID: 10.13039/501100007839)
ISSN: 1520-9210
Peer reviewed: Yes
License: https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
Subject Terms
Algorithms
Coders
Computer simulation
Computer vision
Conditional generative adversarial network
Conditional random fields
convolutional conditional random fields
Datasets
Gallium nitride
Generative adversarial networks
Generators
Image fusion
Image processing
images-to-image
multi-focus image fusion
Point spread functions
Quantitative analysis
synthesize dataset
Task analysis
Training
Transforms
Visual perception driven algorithms
URI: https://ieeexplore.ieee.org/document/8625482
https://www.proquest.com/docview/2261269119