Unsupervised Change Detection From Heterogeneous Data Based on Image Translation


Bibliographic Details
Published in IEEE transactions on geoscience and remote sensing Vol. 60; pp. 1 - 13
Main Authors Liu, Zhun-Ga, Zhang, Zuo-Wei, Pan, Quan, Ning, Liang-Bo
Format Journal Article
Language English
Published New York IEEE 2022
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access Get full text

Abstract Change detection (CD) from heterogeneous remote sensing images is an important and challenging problem. Images obtained from different sensors (e.g., synthetic aperture radar (SAR) and optical camera) characterize distinct properties of objects, so changes cannot be detected by direct comparison of heterogeneous images. In this article, a new unsupervised change detection (USCD) method is proposed based on image translation. Cycle-consistent adversarial networks (CycleGANs) are employed to learn the subimage-to-subimage mapping relation from the given pair (i.e., before and after the event) of heterogeneous images in which the changes are to be detected. One image can then be translated from its original feature space (e.g., SAR) to the other space (e.g., optical), so that the pair of images is represented in a common feature space. Pixels with close values in the before-event image may have quite different values in the after-event image if a change has occurred at those locations. A difference map is therefore generated between the translated before-event image and the original after-event image and divided into changed and unchanged parts. Because this initial division is not very reliable, some significantly changed and unchanged pixel pairs are selected from the two parts with a clustering technique (i.e., K-means). The selected pixel pairs are used to learn a binary classifier, which then classifies the remaining pixel pairs to obtain the final CD results. Experimental results on different real datasets demonstrate the effectiveness of the proposed USCD method compared with several other related methods.
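The post-translation pipeline described in the abstract — a difference map between the two images in a common feature space, K-means (K=2) to split it into changed/unchanged parts, selection of only the confidently clustered pixels, and a classifier trained on them to label the rest — can be sketched in NumPy on toy data. This is a minimal illustration, not the paper's implementation: the synthetic images, the 0.1 confidence margin, and the nearest-mean rule standing in for the binary classifier are all assumptions, and the CycleGAN translation step is replaced by pre-aligned toy arrays.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the pair of images already brought into a common
# feature space (the paper does this via CycleGAN translation).
before = rng.random((32, 32))
after = before.copy()
after[8:16, 8:16] += 0.8            # synthetic "changed" region

# Step 1: difference map between the translated before-event image
# and the original after-event image.
diff = np.abs(after - before).ravel()

# Step 2: K-means with K=2 on the 1-D difference values.
centers = np.array([diff.min(), diff.max()])
for _ in range(20):
    labels = (np.abs(diff - centers[0]) > np.abs(diff - centers[1])).astype(int)
    for k in (0, 1):
        if np.any(labels == k):
            centers[k] = diff[labels == k].mean()
changed = int(np.argmax(centers))   # cluster with the larger mean difference

# Step 3: keep only significantly changed/unchanged pixels, i.e. those
# much closer to their own cluster centre (0.1 margin is an assumption).
margin = np.abs(diff - centers[1 - labels]) - np.abs(diff - centers[labels])
confident = margin > 0.1

# Step 4: a nearest-mean rule trained on the confident pixels stands in
# for the paper's binary classifier and labels every pixel.
mu = [diff[confident & (labels == k)].mean() for k in (0, 1)]
pred = np.abs(diff - mu[changed]) < np.abs(diff - mu[1 - changed])
change_map = pred.reshape(32, 32)
```

On this toy input the recovered change map coincides with the inserted 8x16 square; in the paper the selected pixel pairs instead train a learned binary classifier on richer per-pixel features.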
Author Pan, Quan
Liu, Zhun-Ga
Zhang, Zuo-Wei
Ning, Liang-Bo
Author_xml – sequence: 1
  givenname: Zhun-Ga
  orcidid: 0000-0001-7144-7449
  surname: Liu
  fullname: Liu, Zhun-Ga
  email: liuzhunga@nwpu.edu.cn
  organization: School of Automation, Northwestern Polytechnical University, Xi'an, China
– sequence: 2
  givenname: Zuo-Wei
  orcidid: 0000-0002-4855-5924
  surname: Zhang
  fullname: Zhang, Zuo-Wei
  email: zuowei_zhang@mail.nwpu.edu.cn
  organization: School of Automation, Northwestern Polytechnical University, Xi'an, China
– sequence: 3
  givenname: Quan
  surname: Pan
  fullname: Pan, Quan
  email: quanpan@nwpu.edu.cn
  organization: School of Automation, Northwestern Polytechnical University, Xi'an, China
– sequence: 4
  givenname: Liang-Bo
  surname: Ning
  fullname: Ning, Liang-Bo
  email: ninglb@mail.nwpu.edu.cn
  organization: School of Automation, Northwestern Polytechnical University, Xi'an, China
CODEN IGRSD2
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022
DOI 10.1109/TGRS.2021.3097717
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Water Resources Abstracts
Technology Research Database
Environmental Sciences and Pollution Management
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Aerospace Database
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Advanced Technologies Database with Aerospace
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
Physics
EISSN 1558-0644
EndPage 13
ExternalDocumentID 10_1109_TGRS_2021_3097717
9497508
Genre orig-research
GrantInformation_xml – fundername: Innovation Foundation for Doctor Dissertation of Northwestern Polytechnical University
  grantid: CX201953
  funderid: 10.13039/501100002663
– fundername: National Natural Science Foundation of China
  grantid: 61790554; U20B2067; 61790552
  funderid: 10.13039/501100001809
ISSN 0196-2892
IsPeerReviewed true
IsScholarly true
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
ORCID 0000-0001-7144-7449
0000-0002-4855-5924
PQID 2619590089
PQPubID 85465
PageCount 13
PublicationPlace New York
PublicationTitle IEEE transactions on geoscience and remote sensing
PublicationTitleAbbrev TGRS
PublicationYear 2022
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 1
SubjectTerms Change detection
Classification
Classifiers
Clustering
Detection
Feature extraction
heterogeneous images
image translation
Laser radar
Object recognition
Optical imaging
Optical properties
Optical sensors
Pixels
Radar polarimetry
Remote sensing
Remote sensors
SAR (radar)
Synthetic aperture radar
Translation
unsupervised change detection (USCD)
Title Unsupervised Change Detection From Heterogeneous Data Based on Image Translation
URI https://ieeexplore.ieee.org/document/9497508
https://www.proquest.com/docview/2619590089
Volume 60