Building Type Classification Using CNN-Transformer Cross-Encoder Adaptive Learning From Very High Resolution Satellite Images

Bibliographic Details
Published in IEEE journal of selected topics in applied earth observations and remote sensing, Vol. 18, pp. 976-994
Main Authors Zhang, Shaofeng, Li, Mengmeng, Zhao, Wufan, Wang, Xiaoqin, Wu, Qunyong
Format Journal Article
Language English
Published Piscataway IEEE 01.01.2025
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Abstract Building type information indicates the functional properties of buildings and plays a crucial role in smart city development and urban socioeconomic activities. Existing methods for classifying building types often face challenges in accurately distinguishing between building types while maintaining well-delineated boundaries, especially in complex urban environments. This study introduces a novel framework, i.e., CNN-Transformer cross-attention feature fusion network (CTCFNet), for building type classification from very high resolution remote sensing images. CTCFNet integrates convolutional neural networks (CNNs) and Transformers using an interactive cross-encoder fusion module that enhances semantic feature learning and improves classification accuracy in complex scenarios. We develop an adaptive collaboration optimization module that applies human visual attention mechanisms to enhance the feature representation of building types and boundaries simultaneously. To address the scarcity of datasets in building type classification, we create two new datasets, i.e., the urban building type (UBT) dataset and the town building type (TBT) dataset, for model evaluation. Extensive experiments on these datasets demonstrate that CTCFNet outperforms popular CNNs, Transformers, and dual-encoder methods in identifying building types across various regions, achieving the highest mean intersection over union of 78.20% and 77.11%, F1 scores of 86.83% and 88.22%, and overall accuracy of 95.07% and 95.73% on the UBT and TBT datasets, respectively. We conclude that CTCFNet effectively addresses the challenges of high interclass similarity and intraclass inconsistency in complex scenes, yielding results with well-delineated building boundaries and accurate building types.
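The abstract describes a dual-encoder design in which CNN and Transformer features are exchanged through cross-attention before a per-pixel classification head. The sketch below illustrates that general pattern only; it is not the authors' CTCFNet (whose actual modules, including the adaptive collaboration optimization module, are defined in the paper). All class names, channel widths, patch sizes, and the single fusion stage are illustrative assumptions.

```python
# Minimal sketch of CNN-Transformer cross-attention fusion for per-pixel
# building-type labels. Hypothetical names and hyperparameters throughout.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossEncoderFusion(nn.Module):
    """Toy dual encoder: CNN branch + Transformer branch, fused by cross-attention."""

    def __init__(self, num_classes: int = 6, dim: int = 64, patch: int = 8):
        super().__init__()
        # CNN branch: local texture/boundary cues, downsampled by 8.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, dim, 3, stride=2, padding=1), nn.BatchNorm2d(dim), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.BatchNorm2d(dim), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, stride=2, padding=1), nn.BatchNorm2d(dim), nn.ReLU(),
        )
        # Transformer branch: global context over non-overlapping patches.
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=4, dim_feedforward=2 * dim, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=2)
        # Bidirectional cross-attention: each branch queries the other's tokens,
        # which is one plausible reading of "cross-encoder fusion".
        self.cnn_to_trf = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.trf_to_cnn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.head = nn.Conv2d(2 * dim, num_classes, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, _, h, w = x.shape
        f_cnn = self.cnn(x)                                   # (B, C, h/8, w/8)
        gh, gw = f_cnn.shape[2], f_cnn.shape[3]
        t_cnn = f_cnn.flatten(2).transpose(1, 2)              # (B, N, C)
        t_trf = self.patch_embed(x).flatten(2).transpose(1, 2)
        t_trf = self.transformer(t_trf)                       # (B, N, C)

        # Exchange information in both directions, then fuse by concatenation.
        c2t, _ = self.cnn_to_trf(query=t_cnn, key=t_trf, value=t_trf)
        t2c, _ = self.trf_to_cnn(query=t_trf, key=t_cnn, value=t_cnn)
        fused = torch.cat([t_cnn + c2t, t_trf + t2c], dim=-1)  # (B, N, 2C)
        fused = fused.transpose(1, 2).reshape(b, -1, gh, gw)

        logits = self.head(fused)
        return F.interpolate(logits, size=(h, w), mode="bilinear", align_corners=False)


if __name__ == "__main__":
    model = CrossEncoderFusion(num_classes=6)
    out = model(torch.randn(2, 3, 256, 256))
    print(out.shape)  # torch.Size([2, 6, 256, 256])
```

A confusion matrix accumulated over such per-pixel predictions is what would yield the mean intersection over union, F1, and overall accuracy figures quoted in the abstract.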
Author Li, Mengmeng
Wang, Xiaoqin
Zhao, Wufan
Zhang, Shaofeng
Wu, Qunyong
Author_xml – sequence: 1
  givenname: Shaofeng
  orcidid: 0009-0001-0689-264X
  surname: Zhang
  fullname: Zhang, Shaofeng
  email: 225520031@fzu.edu.cn
  organization: Key Laboratory of Spatial Data Mining and Information Sharing of Ministry of Education, Academy of Digital China, Fuzhou University, Fuzhou, China
– sequence: 2
  givenname: Mengmeng
  orcidid: 0000-0002-9083-0475
  surname: Li
  fullname: Li, Mengmeng
  email: mli@fzu.edu.cn
  organization: Key Laboratory of Spatial Data Mining and Information Sharing of Ministry of Education, Academy of Digital China, Fuzhou University, Fuzhou, China
– sequence: 3
  givenname: Wufan
  orcidid: 0000-0002-0265-3465
  surname: Zhao
  fullname: Zhao, Wufan
  email: wufanzhao@hkust-gz.edu.cn
  organization: Urban Governance and Design Thrust, Society Hub, The Hong Kong University of Science and Technology (Guangzhou), Guangzhou, China
– sequence: 4
  givenname: Xiaoqin
  surname: Wang
  fullname: Wang, Xiaoqin
  email: wxq@fzu.edu.cn
  organization: Key Laboratory of Spatial Data Mining and Information Sharing of Ministry of Education, Academy of Digital China, Fuzhou University, Fuzhou, China
– sequence: 5
  givenname: Qunyong
  surname: Wu
  fullname: Wu, Qunyong
  email: qywu@fzu.edu.cn
  organization: Key Laboratory of Spatial Data Mining and Information Sharing of Ministry of Education, Academy of Digital China, Fuzhou University, Fuzhou, China
CODEN IJSTHZ
Cites_doi 10.1109/TGRS.2023.3320146
10.1016/j.isprsjprs.2016.10.007
10.1109/LGRS.2018.2802944
10.1109/ICAS49788.2021.9551165
10.1016/j.isprsjprs.2016.10.001
10.1109/JSTARS.2020.3043442
10.1109/TGRS.2023.3290817
10.1080/15481603.2022.2101727
10.1109/TGRS.2023.3314641
10.1109/TGRS.2023.3282926
10.1109/CVPRW56347.2022.00147
10.1016/j.isprsjprs.2023.04.019
10.1016/j.isprsjprs.2013.09.004
10.1016/j.isprsjprs.2018.02.006
10.1109/ICCV48922.2021.00083
10.1016/j.scs.2022.104009
10.1080/19479832.2015.1051138
10.1016/j.isprsjprs.2024.08.008
10.1109/JSTARS.2021.3119654
10.1109/TGRS.2022.3152575
10.1007/s10489-016-0762-6
10.1016/j.rse.2019.04.014
10.1109/TGRS.2018.2858817
10.24963/ijcai.2023/80
10.1109/CVPR.2017.660
10.1016/j.media.2024.103280
10.3390/rs13214441
10.1016/j.isprsjprs.2020.01.013
10.1007/978-3-030-87193-2_2
10.1109/IGARSS.2017.8127684
10.1109/CVPR52688.2022.01170
10.1016/j.isprsjprs.2022.06.008
10.1016/j.isprsjprs.2015.03.011
10.1109/LGRS.2023.3262586
10.1109/TGRS.2022.3186634
10.1109/JSTARS.2024.3382636
10.1109/TPAMI.2020.2983686
10.1109/IGARSS52108.2023.10281473
10.1109/TGRS.2023.3329152
10.3390/rs14030564
10.1007/978-3-030-00928-1_48
10.3390/rs15163972
10.1016/j.rse.2010.12.017
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2025
DBID 97E
ESBDL
RIA
RIE
AAYXX
CITATION
7UA
8FD
C1K
F1W
FR3
H8D
H96
KR7
L.G
L7M
DOA
DOI 10.1109/JSTARS.2024.3501678
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE Xplore Open Access Journals
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
Water Resources Abstracts
Technology Research Database
Environmental Sciences and Pollution Management
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Aerospace Database
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Advanced Technologies Database with Aerospace
DOAJ Directory of Open Access Journals (ODIN)
DatabaseTitle CrossRef
Aerospace Database
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
Technology Research Database
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Advanced Technologies Database with Aerospace
Water Resources Abstracts
Environmental Sciences and Pollution Management
DatabaseTitleList Aerospace Database
Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Directory of Open Access Journals (ODIN)
  url: https://www.doaj.org/
  sourceTypes: Open Website
– sequence: 2
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Architecture
Geology
EISSN 2151-1535
EndPage 994
ExternalDocumentID oai_doaj_org_article_979926b8605b48b8b79e2e0649306ebc
10_1109_JSTARS_2024_3501678
10756709
Genre orig-research
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 42471378
  funderid: 10.13039/501100001809
GroupedDBID 0R~
29I
4.4
5GY
5VS
6IK
97E
AAFWJ
AAJGR
AASAJ
AAWTH
ABAZT
ABVLG
ACIWK
AENEX
AETIX
AFPKN
AFRAH
AGSQL
ALMA_UNASSIGNED_HOLDINGS
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
DU5
EBS
EJD
ESBDL
GROUPED_DOAJ
HZ~
IFIPE
IPLJI
JAVBF
M43
O9-
OCL
OK1
RIA
RIE
RNS
AAYXX
CITATION
RIG
7UA
8FD
C1K
F1W
FR3
H8D
H96
KR7
L.G
L7M
ID FETCH-LOGICAL-c429t-adcacd24defed071f61b7a37d00174cb634ef9c1a4814845a0b859c8e4ecc7643
IEDL.DBID DOA
ISSN 1939-1404
IngestDate Wed Aug 27 01:24:25 EDT 2025
Fri Jul 25 23:07:24 EDT 2025
Tue Jul 01 03:16:38 EDT 2025
Wed Aug 27 02:33:11 EDT 2025
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c429t-adcacd24defed071f61b7a37d00174cb634ef9c1a4814845a0b859c8e4ecc7643
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0002-0265-3465
0009-0001-0689-264X
0000-0002-9083-0475
OpenAccessLink https://doaj.org/article/979926b8605b48b8b79e2e0649306ebc
PQID 3141611957
PQPubID 75722
PageCount 19
ParticipantIDs doaj_primary_oai_doaj_org_article_979926b8605b48b8b79e2e0649306ebc
ieee_primary_10756709
crossref_primary_10_1109_JSTARS_2024_3501678
proquest_journals_3141611957
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2025-01-01
PublicationDateYYYYMMDD 2025-01-01
PublicationDate_xml – month: 01
  year: 2025
  text: 2025-01-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE journal of selected topics in applied earth observations and remote sensing
PublicationTitleAbbrev JSTARS
PublicationYear 2025
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref35
ref12
ref34
ref15
ref37
ref14
ref36
ref30
ref11
ref33
ref10
ref32
ref2
ref1
ref17
ref39
ref16
ref38
ref19
ref18
Tan (ref31) 2021; 139
Weber (ref3) 2020
ref24
ref46
ref23
ref45
ref26
ref25
ref47
ref20
ref42
ref41
ref22
ref21
Etten (ref29) 2018
ref43
ref28
ref27
ref8
ref7
Xie (ref44) 2021; 34
ref9
ref4
ref6
ref5
ref40
References_xml – ident: ref9
  doi: 10.1109/TGRS.2023.3320146
– ident: ref14
  doi: 10.1016/j.isprsjprs.2016.10.007
– ident: ref40
  doi: 10.1109/LGRS.2018.2802944
– ident: ref37
  doi: 10.1109/ICAS49788.2021.9551165
– ident: ref8
  doi: 10.1016/j.isprsjprs.2016.10.001
– ident: ref24
  doi: 10.1109/JSTARS.2020.3043442
– ident: ref39
  doi: 10.1109/TGRS.2023.3290817
– ident: ref4
  doi: 10.1080/15481603.2022.2101727
– ident: ref10
  doi: 10.1109/TGRS.2023.3314641
– ident: ref11
  doi: 10.1109/TGRS.2023.3282926
– ident: ref27
  doi: 10.1109/CVPRW56347.2022.00147
– ident: ref23
  doi: 10.1016/j.isprsjprs.2023.04.019
– ident: ref13
  doi: 10.1016/j.isprsjprs.2013.09.004
– ident: ref16
  doi: 10.1016/j.isprsjprs.2018.02.006
– ident: ref36
  doi: 10.1109/ICCV48922.2021.00083
– ident: ref1
  doi: 10.1016/j.scs.2022.104009
– ident: ref6
  doi: 10.1080/19479832.2015.1051138
– ident: ref18
  doi: 10.1016/j.isprsjprs.2024.08.008
– ident: ref21
  doi: 10.1109/JSTARS.2021.3119654
– year: 2018
  ident: ref29
  article-title: SpaceNet: A remote sensing dataset and challenge series
– ident: ref22
  doi: 10.1109/TGRS.2022.3152575
– ident: ref7
  doi: 10.1007/s10489-016-0762-6
– volume: 34
  start-page: 12077
  volume-title: Proc. Int. Conf. Adv. Neural Inf. Process. Syst.
  year: 2021
  ident: ref44
  article-title: SegFormer: Simple and efficient design for semantic segmentation with transformers
– ident: ref2
  doi: 10.1016/j.rse.2019.04.014
– ident: ref26
  doi: 10.1109/TGRS.2018.2858817
– ident: ref35
  doi: 10.24963/ijcai.2023/80
– ident: ref43
  doi: 10.1109/CVPR.2017.660
– ident: ref46
  doi: 10.1016/j.media.2024.103280
– ident: ref45
  doi: 10.3390/rs13214441
– year: 2020
  ident: ref3
  article-title: Building disaster damage assessment in satellite imagery with multi-temporal fusion
– ident: ref38
  doi: 10.1016/j.isprsjprs.2020.01.013
– ident: ref47
  doi: 10.1007/978-3-030-87193-2_2
– ident: ref28
  doi: 10.1109/IGARSS.2017.8127684
– ident: ref32
  doi: 10.1109/CVPR52688.2022.01170
– ident: ref41
  doi: 10.1016/j.isprsjprs.2022.06.008
– ident: ref15
  doi: 10.1016/j.isprsjprs.2015.03.011
– ident: ref17
  doi: 10.1109/LGRS.2023.3262586
– ident: ref5
  doi: 10.1109/TGRS.2022.3186634
– ident: ref30
  doi: 10.1109/JSTARS.2024.3382636
– ident: ref42
  doi: 10.1109/TPAMI.2020.2983686
– ident: ref19
  doi: 10.1109/IGARSS52108.2023.10281473
– ident: ref20
  doi: 10.1109/TGRS.2023.3329152
– ident: ref25
  doi: 10.3390/rs14030564
– ident: ref34
  doi: 10.1007/978-3-030-00928-1_48
– ident: ref33
  doi: 10.3390/rs15163972
– volume: 139
  start-page: 10096
  volume-title: Proc. Int. Conf. Learn. Representations
  year: 2021
  ident: ref31
  article-title: EfficientNetV2: Smaller models and faster training
– ident: ref12
  doi: 10.1016/j.rse.2010.12.017
SSID ssj0062793
Score 2.393226
SourceID doaj
proquest
crossref
ieee
SourceType Open Website
Aggregation Database
Index Database
Publisher
StartPage 976
SubjectTerms Accuracy
Architecture
Artificial neural networks
Attention
Boundaries
Building type classification
Buildings
Classification
CNN-transformer networks
Coders
cross-encoder
Datasets
Earth
Feature extraction
feature interaction
High resolution
Image enhancement
Image resolution
Machine learning
Modules
Neural networks
Optimization
Remote sensing
Satellite imagery
Semantics
Transformers
Urban environments
very high resolution remote sensing
Visual discrimination learning
Visual perception
Visualization
SummonAdditionalLinks – databaseName: IEEE Electronic Library (IEL)
  dbid: RIE
  priority: 102
  providerName: IEEE
Title Building Type Classification Using CNN-Transformer Cross-Encoder Adaptive Learning From Very High Resolution Satellite Images
URI https://ieeexplore.ieee.org/document/10756709
https://www.proquest.com/docview/3141611957
https://doaj.org/article/979926b8605b48b8b79e2e0649306ebc
Volume 18
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider Directory of Open Access Journals
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Building+Type+Classification+Using+CNN-Transformer+Cross-Encoder+Adaptive+Learning+From+Very+High+Resolution+Satellite+Images&rft.jtitle=IEEE+journal+of+selected+topics+in+applied+earth+observations+and+remote+sensing&rft.au=Shaofeng+Zhang&rft.au=Mengmeng+Li&rft.au=Wufan+Zhao&rft.au=Xiaoqin+Wang&rft.date=2025-01-01&rft.pub=IEEE&rft.issn=1939-1404&rft.eissn=2151-1535&rft.volume=18&rft.spage=976&rft.epage=994&rft_id=info:doi/10.1109%2FJSTARS.2024.3501678&rft.externalDBID=DOA&rft.externalDocID=oai_doaj_org_article_979926b8605b48b8b79e2e0649306ebc