SCAttNet: Semantic Segmentation Network With Spatial and Channel Attention Mechanism for High-Resolution Remote Sensing Images

Bibliographic Details
Published in IEEE geoscience and remote sensing letters Vol. 18; no. 5; pp. 905-909
Main Authors Li, Haifeng; Qiu, Kaijian; Chen, Li; Mei, Xiaoming; Hong, Liang; Tao, Chao
Format Journal Article
Language English
Published Piscataway IEEE 01.05.2021
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Subjects
Online Access Get full text

Abstract High-resolution remote sensing images (HRRSIs) contain substantial ground object information, such as texture, shape, and spatial location. Semantic segmentation, which is an important task for element extraction, has been widely used in processing mass HRRSIs. However, HRRSIs often exhibit large intraclass variance and small interclass variance due to the diversity and complexity of ground objects, thereby bringing great challenges to a semantic segmentation task. In this letter, we propose a new end-to-end semantic segmentation network, which integrates lightweight spatial and channel attention modules that can refine features adaptively. We compare our method with several classic methods on the ISPRS Vaihingen and Potsdam data sets. Experimental results show that our method can achieve better semantic segmentation results. The source codes are available at https://github.com/lehaifeng/SCAttNet .
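For orientation, the following is a minimal, hypothetical sketch of the kind of lightweight channel and spatial attention modules the abstract describes, written in PyTorch style. The module names, reduction ratio, and kernel size are illustrative assumptions, not the authors' implementation; the official source code is at https://github.com/lehaifeng/SCAttNet .

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    # Reweights feature channels from global average- and max-pooled descriptors.
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # (B, C) descriptor from average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # (B, C) descriptor from max pooling
        weight = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * weight

class SpatialAttention(nn.Module):
    # Reweights spatial locations from channel-wise average and max maps.
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)   # (B, 1, H, W)
        weight = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * weight

# Usage: adaptively refine backbone features before a segmentation head.
feats = torch.randn(2, 64, 128, 128)
refined = SpatialAttention()(ChannelAttention(64)(feats))
print(refined.shape)  # torch.Size([2, 64, 128, 128])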
Author Qiu, Kaijian
Hong, Liang
Tao, Chao
Li, Haifeng
Chen, Li
Mei, Xiaoming
Author_xml – sequence: 1
  givenname: Haifeng
  orcidid: 0000-0003-1173-6593
  surname: Li
  fullname: Li, Haifeng
  organization: School of Geosciences and Info-Physics, Central South University, Changsha, China
– sequence: 2
  givenname: Kaijian
  surname: Qiu
  fullname: Qiu, Kaijian
  organization: School of Geosciences and Info-Physics, Central South University, Changsha, China
– sequence: 3
  givenname: Li
  orcidid: 0000-0002-4761-5913
  surname: Chen
  fullname: Chen, Li
  organization: School of Geosciences and Info-Physics, Central South University, Changsha, China
– sequence: 4
  givenname: Xiaoming
  surname: Mei
  fullname: Mei, Xiaoming
  organization: School of Geosciences and Info-Physics, Central South University, Changsha, China
– sequence: 5
  givenname: Liang
  surname: Hong
  fullname: Hong, Liang
  organization: College of Tourism and Geographic Science, Yunnan Normal University, Kunming, China
– sequence: 6
  givenname: Chao
  orcidid: 0000-0003-0071-310X
  surname: Tao
  fullname: Tao, Chao
  email: kingtaohao@csu.edu.cn
  organization: School of Geosciences and Info-Physics, Central South University, Changsha, China
CODEN IGRSBY
CitedBy_id crossref_primary_10_3390_geomatics5010007
crossref_primary_10_3390_rs14092253
crossref_primary_10_1109_TGRS_2022_3224733
crossref_primary_10_1109_JSTARS_2023_3280365
crossref_primary_10_3390_rs14184514
crossref_primary_10_1016_j_jag_2023_103646
crossref_primary_10_3390_aerospace10100880
crossref_primary_10_1109_TGRS_2023_3243954
crossref_primary_10_3390_rs13245100
crossref_primary_10_3390_rs16071214
crossref_primary_10_1016_j_asoc_2024_112061
crossref_primary_10_1080_15481603_2024_2356355
crossref_primary_10_3390_app13179491
crossref_primary_10_1016_j_sigpro_2023_109152
crossref_primary_10_1109_JSTARS_2023_3269852
crossref_primary_10_7717_peerj_cs_1558
crossref_primary_10_1109_TIV_2022_3221767
crossref_primary_10_1016_j_neucom_2024_128784
crossref_primary_10_1109_JSTARS_2021_3078631
crossref_primary_10_3788_LOP212864
crossref_primary_10_1080_10095020_2024_2405017
crossref_primary_10_1080_17538947_2024_2341970
crossref_primary_10_3390_rs13010119
crossref_primary_10_1080_01431161_2021_1876272
crossref_primary_10_1109_LGRS_2023_3235117
crossref_primary_10_1109_TIP_2021_3120054
crossref_primary_10_1109_TCSVT_2024_3457622
crossref_primary_10_1109_JSTARS_2021_3102137
crossref_primary_10_1016_j_eswa_2024_124019
crossref_primary_10_1109_ACCESS_2021_3058571
crossref_primary_10_1109_LGRS_2020_3047443
crossref_primary_10_1109_LGRS_2024_3397851
crossref_primary_10_1016_j_dsp_2023_104339
crossref_primary_10_3390_e24121759
crossref_primary_10_1038_s41598_024_65585_1
crossref_primary_10_3390_rs15081980
crossref_primary_10_1080_01431161_2023_2285742
crossref_primary_10_1109_JSTARS_2024_3456842
crossref_primary_10_1002_mp_17628
crossref_primary_10_1109_ACCESS_2021_3111899
crossref_primary_10_1016_j_engappai_2024_108782
crossref_primary_10_1109_TGRS_2023_3339291
crossref_primary_10_1109_TETCI_2022_3182414
crossref_primary_10_1109_JSTARS_2023_3289293
crossref_primary_10_3389_fevo_2023_1201125
crossref_primary_10_3390_math11071644
crossref_primary_10_1109_TGRS_2023_3272614
crossref_primary_10_3788_LOP222250
crossref_primary_10_1016_j_eswa_2024_125779
crossref_primary_10_1109_JSTARS_2023_3310160
crossref_primary_10_3390_rs14194983
crossref_primary_10_3390_rs15010236
crossref_primary_10_1016_j_jag_2024_103661
crossref_primary_10_3390_rs15030840
crossref_primary_10_1109_LGRS_2022_3183613
crossref_primary_10_3390_rs15163975
crossref_primary_10_1177_00405175231205898
crossref_primary_10_3390_electronics12112463
crossref_primary_10_3390_rs14010102
crossref_primary_10_1109_JSTARS_2023_3244209
crossref_primary_10_3390_rs13163196
crossref_primary_10_1109_ACCESS_2024_3355154
crossref_primary_10_1109_TGRS_2022_3168697
crossref_primary_10_1109_LGRS_2021_3116601
crossref_primary_10_3390_agriculture12101543
crossref_primary_10_1109_TGRS_2023_3302024
crossref_primary_10_1371_journal_pone_0301134
crossref_primary_10_3390_ijgi10100672
crossref_primary_10_1109_LGRS_2022_3145499
crossref_primary_10_1109_TGRS_2024_3379669
crossref_primary_10_1109_LGRS_2023_3233979
crossref_primary_10_1109_ACCESS_2024_3451153
crossref_primary_10_3390_rs15215148
crossref_primary_10_1109_TGRS_2021_3085889
crossref_primary_10_1109_TGRS_2021_3103517
crossref_primary_10_3390_rs13152986
crossref_primary_10_3390_rs15040927
crossref_primary_10_1016_j_jag_2021_102515
crossref_primary_10_1109_JSTARS_2022_3205609
crossref_primary_10_1016_j_isprsjprs_2024_04_018
crossref_primary_10_1109_ACCESS_2021_3122162
crossref_primary_10_3390_rs14040818
crossref_primary_10_3390_s23146295
crossref_primary_10_3390_app14177499
crossref_primary_10_1016_j_patrec_2022_04_037
crossref_primary_10_1109_TAI_2024_3363685
crossref_primary_10_1109_LGRS_2023_3307240
crossref_primary_10_1080_01431161_2024_2338232
crossref_primary_10_3390_app14041439
crossref_primary_10_1016_j_compbiomed_2024_108500
crossref_primary_10_1109_JSTARS_2023_3331444
crossref_primary_10_1109_ACCESS_2021_3065695
crossref_primary_10_1038_s41598_025_85125_9
crossref_primary_10_3389_fnins_2024_1363930
crossref_primary_10_1016_j_aej_2024_03_035
crossref_primary_10_1109_JSTARS_2024_3470316
crossref_primary_10_1109_LGRS_2024_3507033
crossref_primary_10_1109_TGRS_2023_3338699
crossref_primary_10_3390_app112110208
crossref_primary_10_1109_JSTARS_2022_3175191
crossref_primary_10_1109_JSTARS_2023_3328559
crossref_primary_10_1109_TGRS_2023_3314641
crossref_primary_10_1109_TIM_2024_3374318
crossref_primary_10_1109_JSTARS_2024_3355943
crossref_primary_10_1016_j_compag_2025_109973
crossref_primary_10_3390_rs14030533
crossref_primary_10_3390_ijgi11070385
crossref_primary_10_3390_rs14030498
crossref_primary_10_3390_rs14030531
crossref_primary_10_3390_rs16173334
crossref_primary_10_1109_TGRS_2021_3076050
crossref_primary_10_3390_rs15123121
crossref_primary_10_1038_s41598_024_72996_7
crossref_primary_10_1109_LGRS_2023_3278448
crossref_primary_10_3390_rs14205175
crossref_primary_10_1109_LGRS_2021_3065039
crossref_primary_10_1109_TGRS_2024_3479190
crossref_primary_10_1109_JSTARS_2023_3335891
crossref_primary_10_1109_TGRS_2025_3531879
crossref_primary_10_3390_su142214723
crossref_primary_10_3390_rs17030402
crossref_primary_10_1038_s41598_024_76622_4
crossref_primary_10_3390_rs14071636
crossref_primary_10_1016_j_eswa_2023_122299
crossref_primary_10_3390_agriculture12081284
crossref_primary_10_1080_27669645_2023_2202961
crossref_primary_10_1016_j_engappai_2023_107638
crossref_primary_10_3390_rs14133109
crossref_primary_10_1080_01431161_2024_2349266
crossref_primary_10_1016_j_neucom_2022_12_004
crossref_primary_10_1109_ACCESS_2023_3320792
crossref_primary_10_1109_TII_2020_3022912
crossref_primary_10_3390_s23021023
crossref_primary_10_1155_2022_8517706
crossref_primary_10_3390_rs14246193
crossref_primary_10_3390_rs15010039
crossref_primary_10_1017_S0263574722001059
crossref_primary_10_3390_electronics13163112
crossref_primary_10_3390_rs14194941
crossref_primary_10_1109_LGRS_2022_3222836
crossref_primary_10_3390_rs13173504
crossref_primary_10_1117_1_JRS_16_044520
crossref_primary_10_1007_s10553_025_01825_y
crossref_primary_10_1109_TETCI_2020_3045485
crossref_primary_10_3233_IDT_240773
crossref_primary_10_1109_JSTARS_2024_3447086
crossref_primary_10_1109_LGRS_2023_3336061
crossref_primary_10_1016_j_compbiomed_2023_107300
crossref_primary_10_1108_IJICC_03_2023_0053
crossref_primary_10_1080_10106049_2024_2311217
crossref_primary_10_1049_ell2_70014
crossref_primary_10_4018_IJSWIS_333712
crossref_primary_10_1109_TGRS_2023_3292112
crossref_primary_10_1016_j_inffus_2025_102960
crossref_primary_10_3390_rs13224518
crossref_primary_10_3390_s25051394
crossref_primary_10_3390_rs14092263
crossref_primary_10_3390_rs14194770
crossref_primary_10_4018_IJWLTT_335115
crossref_primary_10_3390_electronics12051215
crossref_primary_10_3390_rs15051328
crossref_primary_10_1016_j_bspc_2024_107456
crossref_primary_10_3390_rs15235610
crossref_primary_10_1109_TGRS_2023_3336285
crossref_primary_10_1109_TGRS_2022_3153679
crossref_primary_10_3390_rs14164065
crossref_primary_10_1016_j_isprsjprs_2022_06_008
crossref_primary_10_3390_rs16162930
crossref_primary_10_1049_ell2_13305
crossref_primary_10_1109_TGRS_2023_3276172
crossref_primary_10_1109_TMM_2022_3197369
crossref_primary_10_3390_info13050259
crossref_primary_10_1109_LSP_2024_3398358
crossref_primary_10_1007_s00371_024_03419_x
crossref_primary_10_1109_TGRS_2023_3268159
crossref_primary_10_1109_TGRS_2024_3477548
Cites_doi 10.1007/s11263-009-0275-4
10.1080/01431160903439882
10.1109/CVPR.2015.7298965
10.1016/j.isprsjprs.2019.10.001
10.1109/CVPR.2017.549
10.1109/ACCESS.2019.2917952
10.1109/IGARSS.2019.8900224
10.1109/ICCV.2019.00069
10.3390/ijgi7030110
10.3390/rs9060522
10.1109/TPAMI.2016.2644615
10.1109/IGARSS.2019.8900281
10.1109/TPAMI.2011.208
10.1109/CVPR.2016.350
10.1109/CVPR.2017.518
10.1109/IGARSS.2019.8899217
10.1109/CVPR.2019.00326
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
Copyright_xml – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DBID 97E
RIA
RIE
AAYXX
CITATION
7SC
7SP
7TG
7UA
8FD
C1K
F1W
FR3
H8D
H96
JQ2
KL.
KR7
L.G
L7M
L~C
L~D
DOI 10.1109/LGRS.2020.2988294
DatabaseName IEEE All-Society Periodicals Package (ASPP) 2005–Present
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Xplore
CrossRef
Computer and Information Systems Abstracts
Electronics & Communications Abstracts
Meteorological & Geoastrophysical Abstracts
Water Resources Abstracts
Technology Research Database
Environmental Sciences and Pollution Management
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Aerospace Database
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
ProQuest Computer Science Collection
Meteorological & Geoastrophysical Abstracts - Academic
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
DatabaseTitle CrossRef
Civil Engineering Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) Professional
Technology Research Database
Computer and Information Systems Abstracts – Academic
Electronics & Communications Abstracts
ProQuest Computer Science Collection
Computer and Information Systems Abstracts
Water Resources Abstracts
Environmental Sciences and Pollution Management
Computer and Information Systems Abstracts Professional
Aerospace Database
Meteorological & Geoastrophysical Abstracts
Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources
ASFA: Aquatic Sciences and Fisheries Abstracts
Engineering Research Database
Advanced Technologies Database with Aerospace
Meteorological & Geoastrophysical Abstracts - Academic
DatabaseTitleList
Civil Engineering Abstracts
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
Discipline Geography
Geology
EISSN 1558-0571
EndPage 909
ExternalDocumentID 10_1109_LGRS_2020_2988294
9081937
Genre orig-research
GrantInformation_xml – fundername: National Natural Science Foundation of China
  grantid: 41871276; 41771458; 41861048; 41871364; 41871302
  funderid: 10.13039/501100001809
GroupedDBID 0R~
29I
4.4
5GY
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABQJQ
ABVLG
ACGFO
ACIWK
AENEX
AETIX
AFRAH
AGQYO
AGSQL
AHBIQ
AIBXA
AKJIK
AKQYR
ALMA_UNASSIGNED_HOLDINGS
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
EBS
EJD
HZ~
H~9
IFIPE
IPLJI
JAVBF
LAI
M43
O9-
OCL
P2P
RIA
RIE
RNS
~02
AAYXX
CITATION
RIG
7SC
7SP
7TG
7UA
8FD
C1K
F1W
FR3
H8D
H96
JQ2
KL.
KR7
L.G
L7M
L~C
L~D
IEDL.DBID RIE
ISSN 1545-598X
IngestDate Mon Jun 30 08:32:35 EDT 2025
Tue Jul 01 03:45:43 EDT 2025
Thu Apr 24 23:07:50 EDT 2025
Wed Aug 27 02:30:55 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 5
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
LinkModel DirectLink
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 14
ORCID 0000-0003-1173-6593
0000-0002-4761-5913
0000-0003-0071-310X
PQID 2517032090
PQPubID 75725
PageCount 5
ParticipantIDs ieee_primary_9081937
proquest_journals_2517032090
crossref_primary_10_1109_LGRS_2020_2988294
crossref_citationtrail_10_1109_LGRS_2020_2988294
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate 2021-05-01
PublicationDateYYYYMMDD 2021-05-01
PublicationDate_xml – month: 05
  year: 2021
  text: 2021-05-01
  day: 01
PublicationDecade 2020
PublicationPlace Piscataway
PublicationPlace_xml – name: Piscataway
PublicationTitle IEEE geoscience and remote sensing letters
PublicationTitleAbbrev LGRS
PublicationYear 2021
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Publisher_xml – name: IEEE
– name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
References ref13
ref24
ref12
ref23
li (ref14) 2018
ronneberger (ref3) 2015
ref20
oktay (ref15) 2018
ref22
ref10
ref2
ref1
ref17
woo (ref18) 2018
ref16
audebert (ref11) 2016
ref8
ref7
ref9
ref4
ref6
ref5
he (ref19) 2016
chen (ref21) 2018
References_xml – ident: ref6
  doi: 10.1007/s11263-009-0275-4
– ident: ref1
  doi: 10.1080/01431160903439882
– start-page: 630
  year: 2016
  ident: ref19
  article-title: Identity mappings in deep residual networks
  publication-title: Proc Eur Conf Comput Vis
– year: 2018
  ident: ref15
  article-title: Attention U-Net: Learning where to look for the pancreas
  publication-title: arXiv:1804.03999
– ident: ref4
  doi: 10.1109/CVPR.2015.7298965
– ident: ref13
  doi: 10.1016/j.isprsjprs.2019.10.001
– year: 2018
  ident: ref14
  article-title: Pyramid attention network for semantic segmentation
  publication-title: arXiv:1805.10180
– ident: ref20
  doi: 10.1109/CVPR.2017.549
– ident: ref8
  doi: 10.1109/ACCESS.2019.2917952
– ident: ref12
  doi: 10.1109/IGARSS.2019.8900224
– ident: ref17
  doi: 10.1109/ICCV.2019.00069
– ident: ref24
  doi: 10.3390/ijgi7030110
– start-page: 180
  year: 2016
  ident: ref11
  article-title: Semantic segmentation of earth observation data using multimodal and multi-scale deep networks
  publication-title: Proc Asian Conf Comput Vis
– start-page: 3
  year: 2018
  ident: ref18
  article-title: CBAM: Convolutional block attention module
  publication-title: Proc Eur Conf Comput Vis (ECCV)
– ident: ref23
  doi: 10.3390/rs9060522
– start-page: 234
  year: 2015
  ident: ref3
  article-title: U-Net: Convolutional networks for biomedical image segmentation
  publication-title: Proc Int Conf Med Image Comput Comput-Assist Intervent
– ident: ref5
  doi: 10.1109/TPAMI.2016.2644615
– ident: ref9
  doi: 10.1109/IGARSS.2019.8900281
– start-page: 801
  year: 2018
  ident: ref21
  article-title: Encoder-decoder with atrous separable convolution for semantic image segmentation
  publication-title: Proc Eur Conf Comput Vis (ECCV)
– ident: ref2
  doi: 10.1109/TPAMI.2011.208
– ident: ref7
  doi: 10.1109/CVPR.2016.350
– ident: ref22
  doi: 10.1109/CVPR.2017.518
– ident: ref10
  doi: 10.1109/IGARSS.2019.8899217
– ident: ref16
  doi: 10.1109/CVPR.2019.00326
SSID ssj0024887
Score 2.6351597
Snippet High-resolution remote sensing images (HRRSIs) contain substantial ground object information, such as texture, shape, and spatial location. Semantic...
SourceID proquest
crossref
ieee
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 905
SubjectTerms Attention module
Computational modeling
convolutional neural network
Feature extraction
High resolution
Image resolution
Image segmentation
Remote sensing
Resolution
Semantic segmentation
Semantics
Task analysis
Training
Title SCAttNet: Semantic Segmentation Network With Spatial and Channel Attention Mechanism for High-Resolution Remote Sensing Images
URI https://ieeexplore.ieee.org/document/9081937
https://www.proquest.com/docview/2517032090
Volume 18
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE