ResUNet-a: A deep learning framework for semantic segmentation of remotely sensed data

Bibliographic Details
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 162, pp. 94-114
Main Authors: Diakogiannis, Foivos I.; Waldner, François; Caccetta, Peter; Wu, Chen
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2020

Abstract: Scene understanding of high-resolution aerial images is of great importance for automated monitoring in various remote sensing applications. Due to the large within-class and small between-class variance in the pixel values of objects of interest, this remains a challenging task. In recent years, deep convolutional neural networks have come into use in remote sensing applications and demonstrate state-of-the-art performance for pixel-level classification of objects. Here we propose a reliable framework for the semantic segmentation of monotemporal very high resolution aerial images. Our framework consists of a novel deep learning architecture, ResUNet-a, and a novel loss function based on the Dice loss. ResUNet-a uses a UNet encoder/decoder backbone in combination with residual connections, atrous convolutions, pyramid scene parsing pooling and multi-task inference. ResUNet-a infers sequentially the boundary of the objects, the distance transform of the segmentation mask, the segmentation mask itself and a colored reconstruction of the input. Each task is conditioned on the inference of the previous ones, establishing a conditioned relationship between the various tasks, as described by the architecture's computation graph. We analyse the performance of several flavours of the Generalized Dice loss for semantic segmentation, and we introduce a novel variant that has excellent convergence properties and behaves well even in the presence of highly imbalanced classes. The performance of our modeling framework is evaluated on the ISPRS 2D Potsdam dataset. Results show state-of-the-art performance, with an average F1 score of 92.9% over all classes for our best model.
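The class-imbalance behaviour mentioned in the abstract can be illustrated with a minimal NumPy sketch of a Generalized Dice loss. This is not the paper's exact variant (the function name and the inverse-squared-volume weighting below are illustrative assumptions); it only shows why per-class weighting keeps rare classes from being drowned out by frequent ones:

```python
import numpy as np

def generalized_dice_loss(probs, targets, eps=1e-5):
    """Generalized Dice loss over flattened pixels.

    probs, targets: arrays of shape (N, C) holding per-pixel class
    probabilities and one-hot ground truth. Returns a scalar in [0, 1],
    where 0 means a perfect overlap.
    """
    # Inverse squared class volume: a rare class gets a large weight,
    # so it contributes to the loss as much as a dominant class.
    w = 1.0 / (targets.sum(axis=0) ** 2 + eps)
    intersect = (w * (probs * targets).sum(axis=0)).sum()
    union = (w * (probs + targets).sum(axis=0)).sum()
    return 1.0 - 2.0 * intersect / (union + eps)
```

A perfect prediction drives the loss to (nearly) zero, while a prediction that misses every pixel yields a loss near one, regardless of how unbalanced the class volumes are; this bounded, well-scaled behaviour is what makes Dice-family losses attractive under heavy class imbalance.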
Author affiliations:
Foivos I. Diakogiannis (foivos.diakogiannis@data61.csiro.au) - Data61, CSIRO, Floreat, WA, Australia
François Waldner - CSIRO Agriculture & Food, St Lucia, QLD, Australia
Peter Caccetta - Data61, CSIRO, Floreat, WA, Australia
Chen Wu - ICRAR, The University of Western Australia, Crawley, WA, Australia
crossref_primary_10_1007_s11517_024_03252_3
crossref_primary_10_3390_app14104075
crossref_primary_10_1002_mp_15610
crossref_primary_10_1016_j_compmedimag_2022_102104
crossref_primary_10_1016_j_isprsjprs_2021_03_023
crossref_primary_10_1109_LGRS_2022_3227392
crossref_primary_10_1186_s40494_024_01505_w
crossref_primary_10_1016_j_asoc_2024_111918
crossref_primary_10_1016_j_cma_2023_116277
crossref_primary_10_1109_TGRS_2023_3298924
crossref_primary_10_1109_JSTARS_2024_3464691
crossref_primary_10_1016_j_cageo_2022_105196
crossref_primary_10_3390_rs12233928
crossref_primary_10_1038_s41598_025_90709_6
crossref_primary_10_1109_TGRS_2024_3425540
crossref_primary_10_1002_nbm_4794
crossref_primary_10_1371_journal_pone_0309421
crossref_primary_10_1109_LGRS_2021_3063381
crossref_primary_10_1109_ACCESS_2022_3233078
crossref_primary_10_1109_TGRS_2022_3232143
crossref_primary_10_3390_rs17050824
crossref_primary_10_3390_rs14010102
crossref_primary_10_1155_2024_5057538
crossref_primary_10_3390_rs12193270
crossref_primary_10_3788_LOP213019
crossref_primary_10_1016_j_optlaseng_2024_108585
crossref_primary_10_1109_TGRS_2021_3115569
crossref_primary_10_3390_rs15092293
crossref_primary_10_1007_s11042_023_15764_5
crossref_primary_10_1016_j_compag_2023_108547
crossref_primary_10_1016_j_imavis_2024_105055
crossref_primary_10_1093_jmicro_dfab043
crossref_primary_10_1016_j_engappai_2024_108292
crossref_primary_10_1007_s11517_023_02828_9
crossref_primary_10_1109_TGRS_2024_3377009
crossref_primary_10_1016_j_eiar_2024_107633
crossref_primary_10_5194_essd_15_3283_2023
crossref_primary_10_1109_JSTARS_2024_3439267
crossref_primary_10_1080_01431161_2021_1949069
crossref_primary_10_1016_j_bspc_2024_106205
crossref_primary_10_1007_s42979_023_02434_4
crossref_primary_10_1007_s10278_023_00890_1
crossref_primary_10_1073_pnas_2221407120
crossref_primary_10_1007_s00371_024_03722_7
crossref_primary_10_3390_rs13163065
crossref_primary_10_1007_s00521_022_07737_w
crossref_primary_10_1109_JSTARS_2022_3219724
crossref_primary_10_1007_s12559_024_10289_x
crossref_primary_10_1155_2023_9979431
crossref_primary_10_1038_s41598_025_92715_0
crossref_primary_10_3390_rs13142794
crossref_primary_10_1016_j_icte_2022_09_007
crossref_primary_10_1109_JSTARS_2024_3375313
crossref_primary_10_1109_JSTARS_2022_3214485
crossref_primary_10_1016_j_jag_2024_104093
crossref_primary_10_1016_j_mtener_2023_101348
crossref_primary_10_1016_j_isprsjprs_2021_09_005
crossref_primary_10_3390_sym16070870
crossref_primary_10_3390_s24175845
crossref_primary_10_1080_01431161_2022_2135413
crossref_primary_10_1109_JSTARS_2024_3501678
crossref_primary_10_1049_ipr2_12780
crossref_primary_10_3390_s22062330
crossref_primary_10_1080_01431161_2023_2173033
crossref_primary_10_3390_rs16234494
crossref_primary_10_1109_JSTARS_2023_3335891
crossref_primary_10_1016_j_imavis_2024_105068
crossref_primary_10_1016_j_compbiomed_2022_106148
crossref_primary_10_1109_TCSVT_2022_3227172
crossref_primary_10_1109_TGRS_2024_3367632
crossref_primary_10_4103_jmss_jmss_52_22
crossref_primary_10_1007_s12205_023_2285_0
crossref_primary_10_1016_j_addma_2024_104266
crossref_primary_10_1016_j_rsase_2024_101221
crossref_primary_10_1007_s10489_025_06408_2
crossref_primary_10_1016_j_neucom_2025_129382
crossref_primary_10_1016_j_tust_2024_105819
crossref_primary_10_1007_s41064_023_00233_3
crossref_primary_10_1109_JBHI_2024_3506829
crossref_primary_10_3390_app12063024
crossref_primary_10_1007_s00521_023_08729_0
crossref_primary_10_1007_s12524_024_01827_z
crossref_primary_10_3390_rs13163275
crossref_primary_10_1016_j_compbiomed_2023_107541
crossref_primary_10_3390_a16090419
crossref_primary_10_1080_01431161_2023_2275326
crossref_primary_10_1109_TCI_2024_3446230
crossref_primary_10_1007_s10489_021_03111_w
crossref_primary_10_3390_diagnostics13010123
crossref_primary_10_1016_j_eswa_2024_124751
crossref_primary_10_1109_TGRS_2023_3292112
crossref_primary_10_1109_TGRS_2023_3301494
crossref_primary_10_1117_1_JEI_32_5_053016
crossref_primary_10_1016_j_bspc_2024_106484
crossref_primary_10_1016_j_isprsjprs_2024_01_021
crossref_primary_10_1080_01431161_2023_2174386
crossref_primary_10_3390_s23218739
crossref_primary_10_1109_TETCI_2023_3309626
crossref_primary_10_3390_rs15082208
crossref_primary_10_3390_life12111848
crossref_primary_10_1109_TGRS_2025_3532349
crossref_primary_10_3390_rs14092253
crossref_primary_10_3390_rs15071768
crossref_primary_10_1016_j_asoc_2021_107789
crossref_primary_10_1109_TGRS_2024_3373033
crossref_primary_10_1016_j_bspc_2023_105241
crossref_primary_10_1016_j_jag_2022_103087
crossref_primary_10_1016_j_cmpbup_2023_100109
crossref_primary_10_1016_j_compbiomed_2024_108947
crossref_primary_10_1016_j_heliyon_2024_e26414
crossref_primary_10_1016_j_measen_2023_100998
crossref_primary_10_1109_TGRS_2023_3281420
crossref_primary_10_1016_j_isprsjprs_2023_08_001
crossref_primary_10_1109_ACCESS_2024_3415169
crossref_primary_10_3390_rs13163211
crossref_primary_10_34133_remotesensing_0078
crossref_primary_10_1016_j_knosys_2024_112203
crossref_primary_10_1016_j_neucom_2025_129593
crossref_primary_10_3390_rs13091749
crossref_primary_10_3390_rs15092231
crossref_primary_10_1117_1_JRS_18_034522
crossref_primary_10_3390_a14060159
crossref_primary_10_1109_ACCESS_2022_3205419
crossref_primary_10_1016_j_bspc_2024_106866
crossref_primary_10_31590_ejosat_1057643
crossref_primary_10_1109_ACCESS_2021_3111899
crossref_primary_10_1109_TGRS_2022_3174651
crossref_primary_10_3390_rs15061701
crossref_primary_10_1186_s40317_021_00248_w
crossref_primary_10_1007_s11517_024_03052_9
crossref_primary_10_3390_app122111226
crossref_primary_10_3390_rs15092464
crossref_primary_10_1109_TGRS_2024_3520610
crossref_primary_10_1007_s11042_023_14752_z
crossref_primary_10_1109_TGRS_2024_3367850
crossref_primary_10_1016_j_zemedi_2023_08_006
crossref_primary_10_1080_07038992_2022_2144179
crossref_primary_10_3390_rs13112077
crossref_primary_10_1016_j_jag_2024_103662
crossref_primary_10_1016_j_jag_2024_103661
crossref_primary_10_1002_ima_23207
crossref_primary_10_1007_s40747_025_01803_1
crossref_primary_10_1016_j_asoc_2024_112399
crossref_primary_10_1289_EHP13214
crossref_primary_10_3390_a17050182
crossref_primary_10_1109_JBHI_2022_3181462
crossref_primary_10_3390_rs14092225
crossref_primary_10_1109_JSTARS_2021_3139017
crossref_primary_10_1016_j_isprsjprs_2023_09_021
crossref_primary_10_1016_j_apgeog_2024_103399
crossref_primary_10_3390_ijgi10100672
crossref_primary_10_1109_TGRS_2024_3354783
crossref_primary_10_1007_s12559_025_10425_1
crossref_primary_10_2174_0126662558275210231121044758
crossref_primary_10_1016_j_optlastec_2024_111222
crossref_primary_10_3390_rs13142788
crossref_primary_10_1109_TGRS_2023_3273818
crossref_primary_10_1109_TGRS_2022_3183144
crossref_primary_10_1007_s11837_024_06681_5
crossref_primary_10_1109_TAI_2024_3363685
crossref_primary_10_3390_electronics13173414
crossref_primary_10_1016_j_inffus_2024_102795
crossref_primary_10_1117_1_JMI_11_2_024004
crossref_primary_10_3390_rs13020197
crossref_primary_10_1016_j_eswa_2023_119950
crossref_primary_10_1016_j_compeleceng_2024_109719
crossref_primary_10_3390_jimaging10120297
crossref_primary_10_3390_app14146299
crossref_primary_10_1007_s10278_021_00571_x
crossref_primary_10_1016_j_scitotenv_2022_155826
crossref_primary_10_3390_s24227266
crossref_primary_10_1109_TMI_2022_3180435
crossref_primary_10_1016_j_isprsjprs_2021_07_001
crossref_primary_10_3390_s22103784
crossref_primary_10_3390_rs13040808
crossref_primary_10_1109_JBHI_2023_3318640
crossref_primary_10_1007_s13748_025_00367_y
crossref_primary_10_1109_TGRS_2022_3233637
crossref_primary_10_1007_s10895_024_04032_w
crossref_primary_10_1016_j_cmpb_2024_108177
crossref_primary_10_1016_j_compbiomed_2024_108759
crossref_primary_10_1109_TGRS_2025_3531879
crossref_primary_10_1016_j_compbiomed_2025_109708
crossref_primary_10_1016_j_ymeth_2024_10_010
crossref_primary_10_1109_TMI_2023_3320151
crossref_primary_10_3390_diagnostics12122952
crossref_primary_10_3390_rs14102291
crossref_primary_10_1016_j_jag_2023_103345
crossref_primary_10_1155_int_9987190
crossref_primary_10_1016_j_knosys_2024_112217
crossref_primary_10_3390_jimaging10070161
crossref_primary_10_47164_ijngc_v13i5_903
crossref_primary_10_1016_j_compbiomed_2024_109617
crossref_primary_10_1080_09507116_2022_2163937
crossref_primary_10_1049_ipr2_12948
crossref_primary_10_3390_rs14184582
crossref_primary_10_3390_rs15184555
crossref_primary_10_1049_ipr2_12708
crossref_primary_10_1142_S0218001423540253
crossref_primary_10_1109_JSTARS_2024_3388464
crossref_primary_10_3390_app11136072
crossref_primary_10_1109_TIM_2025_3545864
crossref_primary_10_1007_s11069_022_05612_4
crossref_primary_10_1016_j_rse_2023_113833
crossref_primary_10_1109_TIV_2022_3216734
crossref_primary_10_5194_amt_17_961_2024
crossref_primary_10_1109_JSTARS_2022_3203750
crossref_primary_10_1016_j_jag_2023_103332
crossref_primary_10_1371_journal_pone_0246071
crossref_primary_10_1109_ACCESS_2024_3407795
crossref_primary_10_3390_rs15081996
crossref_primary_10_1007_s00521_022_07859_1
Cites_doi 10.1016/j.rse.2017.11.026
10.3390/app9102110
10.1016/0005-2795(75)90109-9
10.1016/j.isprsjprs.2017.12.007
10.3390/rs8040329
10.1016/j.rse.2011.04.032
10.1109/TCYB.2016.2531179
10.1109/ICCV.2015.164
10.1109/CVPR.2017.660
10.1016/S0734-189X(86)80047-0
10.1109/CVPRW.2015.7301382
10.1371/journal.pone.0181911
10.1016/j.isprsjprs.2017.08.011
10.1109/CVPR.2018.00747
10.1162/neco_a_00990
10.1007/s11263-009-0275-4
10.1109/TGRS.2016.2616585
10.1109/JSTARS.2016.2645798
10.1016/j.isprsjprs.2019.04.015
10.2307/1932409
10.1109/TGRS.2017.2669341
10.3390/rs10050743
10.1109/ICCV.2017.244
10.1109/TKDE.2009.191
10.1109/CVPR.2009.5206848
10.1007/978-3-319-46976-8_19
10.1016/j.isprsjprs.2017.11.009
10.1016/j.isprsjprs.2013.09.014
10.3390/rs61111372
10.23915/distill.00003
10.1117/12.586823
10.1007/978-3-319-67558-9_28
10.1109/IGARSS.2017.8128165
10.1016/j.rse.2010.12.017
10.1109/ICCV.2017.324
10.1016/j.isprsjprs.2017.11.011
10.1109/CVPRW.2017.200
10.3390/rs9040368
10.1109/TMI.2006.880587
10.3390/rs10111768
10.3390/rs8030232
10.3390/rs3081777
10.1162/neco.1989.1.4.541
10.1109/34.87344
10.1109/MGRS.2017.2762307
10.1109/34.1000236
10.1109/TGRS.2015.2400462
10.1109/ICCV.2017.322
10.1109/JSTARS.2016.2582921
ContentType Journal Article
Copyright 2020 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS)
Copyright_xml – notice: 2020 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS)
DBID AAYXX
CITATION
7S9
L.6
DOI 10.1016/j.isprsjprs.2020.01.013
DatabaseName CrossRef
AGRICOLA
AGRICOLA - Academic
DatabaseTitle CrossRef
AGRICOLA
AGRICOLA - Academic
DatabaseTitleList AGRICOLA

DeliveryMethod fulltext_linktorsrc
Discipline Geography
Architecture
Engineering
EISSN 1872-8235
EndPage 114
ExternalDocumentID 10_1016_j_isprsjprs_2020_01_013
S0924271620300149
GroupedDBID --K
--M
.~1
0R~
1B1
1RT
1~.
1~5
29J
4.4
457
4G.
5GY
5VS
7-5
71M
8P~
9JN
AACTN
AAEDT
AAEDW
AAIAV
AAIKC
AAIKJ
AAKOC
AALRI
AAMNW
AAOAW
AAQFI
AAQXK
AAXUO
AAYFN
ABBOA
ABFNM
ABJNI
ABMAC
ABQEM
ABQYD
ABXDB
ABYKQ
ACDAQ
ACGFS
ACLVX
ACNNM
ACRLP
ACSBN
ACZNC
ADBBV
ADEZE
ADJOM
ADMUD
AEBSH
AEKER
AENEX
AFKWA
AFTJW
AGHFR
AGUBO
AGYEJ
AHHHB
AHZHX
AIALX
AIEXJ
AIKHN
AITUG
AJBFU
AJOXV
ALMA_UNASSIGNED_HOLDINGS
AMFUW
AMRAJ
AOUOD
ASPBG
ATOGT
AVWKF
AXJTR
AZFZN
BKOJK
BLXMC
CS3
DU5
EBS
EFJIC
EFLBG
EJD
EO8
EO9
EP2
EP3
FDB
FEDTE
FGOYB
FIRID
FNPLU
FYGXN
G-2
G-Q
G8K
GBLVA
GBOLZ
HMA
HVGLF
HZ~
H~9
IHE
IMUCA
J1W
KOM
LY3
M41
MO0
N9A
O-L
O9-
OAUVE
OZT
P-8
P-9
P2P
PC.
Q38
R2-
RIG
RNS
ROL
RPZ
SDF
SDG
SEP
SES
SEW
SPC
SPCBC
SSE
SSV
SSZ
T5K
T9H
WUQ
ZMT
~02
~G-
AAHBH
AATTM
AAXKI
AAYWO
AAYXX
ABDPE
ABWVN
ACRPL
ACVFH
ADCNI
ADNMO
AEIPS
AEUPX
AFJKZ
AFPUW
AFXIZ
AGCQF
AGQPQ
AGRNS
AIGII
AIIUN
AKBMS
AKRWK
AKYEP
ANKPU
APXCP
BNPGV
CITATION
SSH
7S9
L.6
ID FETCH-LOGICAL-c348t-ec5a11e77df8eb06ecd1e22574f12f2b7ab067e54d8b1e0c5a2c38b210b8447a3
IEDL.DBID .~1
ISSN 0924-2716
IngestDate Fri Jul 11 08:56:32 EDT 2025
Tue Jul 01 03:46:43 EDT 2025
Thu Apr 24 23:10:01 EDT 2025
Fri Feb 23 02:47:45 EST 2024
IsPeerReviewed true
IsScholarly true
Keywords Loss function
Data augmentation
Architecture
Very high spatial resolution
Convolutional neural network
Language English
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c348t-ec5a11e77df8eb06ecd1e22574f12f2b7ab067e54d8b1e0c5a2c38b210b8447a3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
PQID 2400520889
PQPubID 24069
PageCount 21
ParticipantIDs proquest_miscellaneous_2400520889
crossref_primary_10_1016_j_isprsjprs_2020_01_013
crossref_citationtrail_10_1016_j_isprsjprs_2020_01_013
elsevier_sciencedirect_doi_10_1016_j_isprsjprs_2020_01_013
ProviderPackageCode CITATION
AAYXX
PublicationCentury 2000
PublicationDate April 2020
2020-04-00
20200401
PublicationDateYYYYMMDD 2020-04-01
PublicationDate_xml – month: 04
  year: 2020
  text: April 2020
PublicationDecade 2020
PublicationTitle ISPRS journal of photogrammetry and remote sensing
PublicationYear 2020
Publisher Elsevier B.V
Publisher_xml – name: Elsevier B.V
References Badrinarayanan, V., Kendall, A., Cipolla, R., 2015. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. CoRR abs/1511.00561.
Borgefors (b0045) 1986; 34
Zhu, J., Park, T., Isola, P., Efros, A.A., 2017. Unpaired image-to-image translation using cycle-consistent adversarial networks. CoRR abs/1703.10593.
Lin, T., Goyal, P., Girshick, R.B., He, K., Dollár, P., 2017. Focal loss for dense object detection. CoRR abs/1708.02002.
ISPRS, International society for photogrammetry and remote sensing (isprs) and bsf swissphoto: Wg3 potsdam overhead data.
Kingma, D.P., Ba, J., 2014. Adam: A method for stochastic optimization. CoRR abs/1412.6980.
Taghanaki, S.A., Abhishek, K., Cohen, J.P., Cohen-Adad, J., Hamarneh, G., 2019. Deep semantic segmentation of natural and medical images: a review arXiv
Sørensen (b0340) 1948; 5
Chen, L., Papandreou, G., Schroff, F., Adam, H., 2017. Rethinking atrous convolution for semantic image segmentation. CoRR abs/1706.05587.
Vadivel, A., Sural, Shamik, Majumdar, A.K., 2005. Human color perception in the hsv space and its application in histogram generation for image retrieval. doi
Pan, Gao, Marinoni, Zhang, Yang, Gamba (b0290) 2018; 10
Pan, Gao, Zhang, Yang, Liao (b0295) 2018
Audebert, N., Saux, B.L., Lefèvre, S., 2016. Semantic segmentation of earth observation data using multimodal and multi-scale deep networks. CoRR abs/1609.06846.
He, K., Gkioxari, G., Dollár, P., Girshick, R.B., 2017. Mask R-CNN. CoRR abs/1703.06870.
He, K., Zhang, X., Ren, S., Sun, J., 2015. Deep residual learning for image recognition. CoRR abs/1512.03385.
Liu, Minh Nguyen, Deligiannis, Ding, Munteanu (b0215) 2017; 9
Myint, Gober, Brazel, Grossman-Clarke, Weng (b0265) 2011; 115
Audebert, Le Saux, Lefèvre (b0010) 2018; 140
Goyal, P., Dollár, P., Girshick, R.B., Noordhuis, P., Wesolowski, L., Kyrola, A., Tulloch, A., Jia, Y., He, K., 2017. Accurate, large minibatch SGD: training imagenet in 1 hour. CoRR abs/1706.02677.
Ioffe, S., Szegedy, C., 2015. Batch normalization: Accelerating deep network training by reducing internal covariate shift. CoRR abs/1502.03167.
Jaderberg, M., Simonyan, K., Zisserman, A., Kavukcuoglu, K., 2015. Spatial transformer networks. CoRR abs/1506.02025.
Li, S., Jiao, J., Han, Y., Weissman, T., 2016. Demystifying resnet. CoRR abs/1611.01186.
Zhao, Du, Wang, Emery (b0420) 2017; 132
Lu, Yuan, Zheng (b0230) 2017; 47
Matikainen, Karila (b0250) 2011; 3
Wen, Huang, Liu, Liao, Zhang (b0375) 2017; 10
Zhu, Tuia, Mou, Xia, Zhang, Xu, Fraundorfer (b0430) 2017; 5
Blaschke, Hay, Kelly, Lang, Hofmann, Addink, Feitosa, Van der Meer, Van der Werff, Van Coillie (b0040) 2014; 87
He, K., Zhang, X., Ren, S., Sun, J., 2014. Spatial pyramid pooling in deep convolutional networks for visual recognition. CoRR abs/1406.4729.
Liu, Y., Piramanayagam, S., Monteiro, S.T., Saber, E., 2017b. Dense semantic labeling of very-high-resolution aerial imagery and lidar with fully-convolutional neural networks and higher-order crfs. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Honolulu, USA.
Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y., 2014. Generative adversarial nets. In: Ghahramani, Z., Welling, M., Cortes, C., Lawrence, N.D., Weinberger, K.Q. (Eds.), Advances in Neural Information Processing Systems, vol. 27. Curran Associates, Inc., pp. 2672–2680.
Ma, Liu, Zhang, Ye, Yin, Johnson (b0235) 2019; 152
Sudre, C.H., Li, W., Vercauteren, T., Ourselin, S., Cardoso, M.J., 2017. Generalised dice overlap as a deep learning loss function for highly unbalanced segmentations. CoRR abs/1707.03237.
Kervadec, H., Bouchtiba, J., Desrosiers, C., Granger, É., Dolz, J., Ayed, I.B., 2018. Boundary loss for highly unbalanced segmentation arXiv
Baatz, M., Schäpe, A., 2000. Multiresolution segmentation: an optimization approach for high quality multi-scale image segmentation (ecognition), 12–23.
Vincent, Soille (b0360) 1991
Lambert, Waldner, Defourny (b0175) 2016; 8
He, K., Zhang, X., Ren, S., Sun, J., 2016. Identity mappings in deep residual networks. CoRR abs/1603.05027.
Zhang, Z., Liu, Q., Wang, Y., 2017. Road extraction by deep residual u-net. CoRR abs/1711.10684.
Bertasius, G., Shi, J., Torresani, L., 2015. Semantic segmentation with boundary neural fields. CoRR abs/1511.02674.
Goldblatt, Stuhlmacher, Tellman, Clinton, Hanson, Georgescu, Wang, Serrano-Candela, Khandelwal, Cheng (b0100) 2018; 205
Pan, Yang (b0285) 2010; 22
Cheng, Wang, Xu, Wang, Xiang, Pan (b0065) 2017; 55
He, K., Girshick, R.B., Dollár, P., 2018. Rethinking imagenet pre-training. CoRR abs/1811.08883.
Rawat, Wang (b0310) 2017; 29
Liu, Fan, Wang, Bai, Xiang, Pan (b0210) 2018; 145
Audebert, Le Saux, Lefèvre (b0015) 2017; 9
Xie, S., Tu, Z., 2015. Holistically-nested edge detection. CoRR abs/1504.06375.
Drozdzal, M., Vorontsov, E., Chartrand, G., Kadoury, S., Pal, C., 2016. The importance of skip connections in biomedical image segmentation. CoRR abs/1608.04117.
Sergeev, A., Balso, M.D., 2018. Horovod: fast and easy distributed deep learning in TensorFlow. arXiv preprint arXiv
Chen, L., Papandreou, G., Kokkinos, I., Murphy, K., Yuille, A.L., 2016. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs. CoRR abs/1606.00915.
Ruder, S., 2017. An overview of multi-task learning in deep neural networks. CoRR abs/1706.05098.
Huang, G., Liu, Z., Weinberger, K.Q., 2016. Densely connected convolutional networks. CoRR abs/1608.06993.
Odena, Dumoulin, Olah (b0275) 2016
Yang, Wu, Yao, Wu, Wang, Xu (b0390) 2018; 10
Li, Shao (b0200) 2014; 6
Long, J., Shelhamer, E., Darrell, T., 2014. Fully convolutional networks for semantic segmentation. CoRR abs/1411.4038.
Zagoruyko, S., Komodakis, N., 2016. Wide residual networks. CoRR abs/1605.07146. http://arxiv.org/abs/1605.07146, arXiv:1605.07146.
Penatti, O.A., Nogueira, K., dos Santos, J.A., 2015. Do deep features generalize from everyday objects to remote sensing and aerial scenes domains? In: 2015 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 44–51.
Paisitkriangkrai, Sherrah, Janney, van den Hengel (b0280) 2016; 9
Gu, Wang, Li (b0115) 2019; 9
Längkvist, Kiselev, Alirezaie, Loutfi (b0180) 2016; 8
Novikov, A.A., Major, D., Lenis, D., Hladuvka, J., Wimmer, M., Bühler, K., 2017. Fully convolutional architectures for multi-class segmentation in chest radiographs. CoRR abs/1701.08816.
Piramanayagam, Saber, Schwartzkopf, Koehler (b0305) 2018
Volpi, Tuia (b0365) 2017; 55
Zhao, H., Shi, J., Qi, X., Wang, X., Jia, J., 2017a. Pyramid scene parsing network. In: CVPR.
Everingham, Van Gool, Williams, Winn, Zisserman (b0095) 2010; 88
Deng, J., Dong, W., Socher, R., Li, L.J., Li, K., Fei-Fei, L., 2009. ImageNet: A Large-Scale Hierarchical Image Database. In: CVPR09.
LeCun, Boser, Denker, Henderson, Howard, Hubbard, Jackel (b0185) 1989; 1
Xie, S.M., Jean, N., Burke, M., Lobell, D.B., Ermon, S., 2015. Transfer learning from deep features for remote sensing and poverty mapping. CoRR abs/1510.00098.
Chen, T., Li, M., Li, Y., Lin, M., Wang, N., Wang, M., Xiao, T., Xu, B., Zhang, C., Zhang, Z., 2015. Mxnet: A flexible and efficient machine learning library for heterogeneous distributed systems. arXiv preprint arXiv
Abraham, N., Khan, N.M., 2018. A novel focal tversky loss function with improved attention u-net for lesion segmentation. CoRR abs/1810.07842.
Waldner, Hansen, Potapov, Löw, Newby, Ferreira, Defourny (b0370) 2017; 12
Crum, Camara, Hill (b0075) 2006; 25
Li, Femiani, Xu, Zhang, Wonka (b0190) 2015; 53
Sherrah, J., 2016. Fully convolutional networks for dense semantic labelling of high-resolution aerial imagery. CoRR abs/1606.02585.
Matthews (b0255) 1975; 405
Dice, L.R., 1945. Measures of the amount of ecologic association between species. Ecology 26, 297–302. doi
Smith, L.N., 2018. A disciplined approach to neural network hyper-parameters: Part 1 – learning rate, batch size, momentum, and weight decay. CoRR abs/1803.09820.
Zhang, Seto (b0405) 2011; 115
Ronneberger, O., Fischer, P., Brox, T., 2015. U-net: Convolutional networks for biomedical image segmentation. CoRR abs/1505.04597.
Marmanis, D., Wegner, J.D., Galliani, S., Schindler, K., Datcu, M., Stilla, U., 2016. Semantic segmentation of aerial images with an ensemble of cnns.
Marmanis, Schindler, Wegner, Galliani, Datcu, Stilla (b0240) 2018; 135
Zhang, H., Dana, K., Shi, J., Zhang, Z., Wang, X., Tyagi, A., Agrawal, A., 2018. Context encoding for semantic segmentation. In: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
Comaniciu, Meer (b0070) 2002; 24
Milletari, F., Navab, N., Ahmadi, S., 2016. V-net: Fully convolutional neural networks for volumetric medical image segmentation. CoRR abs/1606.04797.
Borgefors (10.1016/j.isprsjprs.2020.01.013_b0045) 1986; 34
Ma (10.1016/j.isprsjprs.2020.01.013_b0235) 2019; 152
10.1016/j.isprsjprs.2020.01.013_b0315
Yang (10.1016/j.isprsjprs.2020.01.013_b0390) 2018; 10
Cheng (10.1016/j.isprsjprs.2020.01.013_b0065) 2017; 55
Paisitkriangkrai (10.1016/j.isprsjprs.2020.01.013_b0280) 2016; 9
Pan (10.1016/j.isprsjprs.2020.01.013_b0295) 2018
10.1016/j.isprsjprs.2020.01.013_b0120
10.1016/j.isprsjprs.2020.01.013_b0165
Vincent (10.1016/j.isprsjprs.2020.01.013_b0360) 1991
10.1016/j.isprsjprs.2020.01.013_b0320
10.1016/j.isprsjprs.2020.01.013_b0245
Rawat (10.1016/j.isprsjprs.2020.01.013_b0310) 2017; 29
Crum (10.1016/j.isprsjprs.2020.01.013_b0075) 2006; 25
10.1016/j.isprsjprs.2020.01.013_b0125
10.1016/j.isprsjprs.2020.01.013_b0400
10.1016/j.isprsjprs.2020.01.013_b0005
10.1016/j.isprsjprs.2020.01.013_b0080
LeCun (10.1016/j.isprsjprs.2020.01.013_b0185) 1989; 1
10.1016/j.isprsjprs.2020.01.013_b0160
10.1016/j.isprsjprs.2020.01.013_b0085
Blaschke (10.1016/j.isprsjprs.2020.01.013_b0040) 2014; 87
Everingham (10.1016/j.isprsjprs.2020.01.013_b0095) 2010; 88
10.1016/j.isprsjprs.2020.01.013_b0105
10.1016/j.isprsjprs.2020.01.013_b0425
Comaniciu (10.1016/j.isprsjprs.2020.01.013_b0070) 2002; 24
Pan (10.1016/j.isprsjprs.2020.01.013_b0285) 2010; 22
10.1016/j.isprsjprs.2020.01.013_b0350
Lu (10.1016/j.isprsjprs.2020.01.013_b0230) 2017; 47
10.1016/j.isprsjprs.2020.01.013_b0395
10.1016/j.isprsjprs.2020.01.013_b0110
10.1016/j.isprsjprs.2020.01.013_b0155
10.1016/j.isprsjprs.2020.01.013_b0035
10.1016/j.isprsjprs.2020.01.013_b0355
Li (10.1016/j.isprsjprs.2020.01.013_b0190) 2015; 53
Liu (10.1016/j.isprsjprs.2020.01.013_b0210) 2018; 145
Matikainen (10.1016/j.isprsjprs.2020.01.013_b0250) 2011; 3
Wen (10.1016/j.isprsjprs.2020.01.013_b0375) 2017; 10
Goldblatt (10.1016/j.isprsjprs.2020.01.013_b0100) 2018; 205
10.1016/j.isprsjprs.2020.01.013_b0270
10.1016/j.isprsjprs.2020.01.013_b0150
10.1016/j.isprsjprs.2020.01.013_b0030
10.1016/j.isprsjprs.2020.01.013_b0195
Li (10.1016/j.isprsjprs.2020.01.013_b0200) 2014; 6
Gu (10.1016/j.isprsjprs.2020.01.013_b0115) 2019; 9
10.1016/j.isprsjprs.2020.01.013_b0415
10.1016/j.isprsjprs.2020.01.013_b0020
10.1016/j.isprsjprs.2020.01.013_b0220
Matthews (10.1016/j.isprsjprs.2020.01.013_b0255) 1975; 405
10.1016/j.isprsjprs.2020.01.013_b0385
Odena (10.1016/j.isprsjprs.2020.01.013_b0275) 2016
10.1016/j.isprsjprs.2020.01.013_b0145
Liu (10.1016/j.isprsjprs.2020.01.013_b0215) 2017; 9
10.1016/j.isprsjprs.2020.01.013_b0025
10.1016/j.isprsjprs.2020.01.013_b0300
Zhu (10.1016/j.isprsjprs.2020.01.013_b0430) 2017; 5
10.1016/j.isprsjprs.2020.01.013_b0345
10.1016/j.isprsjprs.2020.01.013_b0225
Längkvist (10.1016/j.isprsjprs.2020.01.013_b0180) 2016; 8
10.1016/j.isprsjprs.2020.01.013_b0060
Zhao (10.1016/j.isprsjprs.2020.01.013_b0420) 2017; 132
10.1016/j.isprsjprs.2020.01.013_b0380
Marmanis (10.1016/j.isprsjprs.2020.01.013_b0240) 2018; 135
10.1016/j.isprsjprs.2020.01.013_b0260
Volpi (10.1016/j.isprsjprs.2020.01.013_b0365) 2017; 55
10.1016/j.isprsjprs.2020.01.013_b0140
Myint (10.1016/j.isprsjprs.2020.01.013_b0265) 2011; 115
Zhang (10.1016/j.isprsjprs.2020.01.013_b0405) 2011; 115
Piramanayagam (10.1016/j.isprsjprs.2020.01.013_b0305) 2018
Pan (10.1016/j.isprsjprs.2020.01.013_b0290) 2018; 10
10.1016/j.isprsjprs.2020.01.013_b0325
10.1016/j.isprsjprs.2020.01.013_b0205
Audebert (10.1016/j.isprsjprs.2020.01.013_b0010) 2018; 140
Audebert (10.1016/j.isprsjprs.2020.01.013_b0015) 2017; 9
Sørensen (10.1016/j.isprsjprs.2020.01.013_b0340) 1948; 5
10.1016/j.isprsjprs.2020.01.013_b0130
10.1016/j.isprsjprs.2020.01.013_b0055
Lambert (10.1016/j.isprsjprs.2020.01.013_b0175) 2016; 8
10.1016/j.isprsjprs.2020.01.013_b0330
10.1016/j.isprsjprs.2020.01.013_b0135
10.1016/j.isprsjprs.2020.01.013_b0410
Waldner (10.1016/j.isprsjprs.2020.01.013_b0370) 2017; 12
10.1016/j.isprsjprs.2020.01.013_b0335
10.1016/j.isprsjprs.2020.01.013_b0090
10.1016/j.isprsjprs.2020.01.013_b0170
10.1016/j.isprsjprs.2020.01.013_b0050
References_xml – volume: 55
  start-page: 3322
  year: 2017
  end-page: 3337
  ident: b0065
  article-title: Automatic road detection and centerline extraction via cascaded end-to-end convolutional neural network
  publication-title: IEEE Trans. Geosci. Remote Sens.
– reference: Chen, T., Li, M., Li, Y., Lin, M., Wang, N., Wang, M., Xiao, T., Xu, B., Zhang, C., Zhang, Z., 2015. Mxnet: A flexible and efficient machine learning library for heterogeneous distributed systems. arXiv preprint arXiv:
– reference: He, K., Girshick, R.B., Dollár, P., 2018. Rethinking imagenet pre-training. CoRR abs/1811.08883.
– volume: 88
  start-page: 303
  year: 2010
  end-page: 338
  ident: b0095
  article-title: The pascal visual object classes (voc) challenge
  publication-title: Int. J. Comput. Vision
– reference: Taghanaki, S.A., Abhishek, K., Cohen, J.P., Cohen-Adad, J., Hamarneh, G., 2019. Deep semantic segmentation of natural and medical images: a review arXiv:
– volume: 29
  start-page: 2352
  year: 2017
  end-page: 2449
  ident: b0310
  article-title: Deep convolutional neural networks for image classification: a comprehensive review
  publication-title: Neural Comput.
– reference: Xie, S., Tu, Z., 2015. Holistically-nested edge detection. CoRR abs/1504.06375.
– volume: 135
  start-page: 158
  year: 2018
  end-page: 172
  ident: b0240
  article-title: Classification with an edge: Improving semantic image segmentation with boundary detection
  publication-title: ISPRS J. Photogramm. Remote Sens.
– start-page: 583
  year: 1991
  end-page: 598
  ident: b0360
  article-title: Watersheds in digital spaces: an efficient algorithm based on immersion simulations
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
– reference: Huang, G., Liu, Z., Weinberger, K.Q., 2016. Densely connected convolutional networks. CoRR abs/1608.06993.
– reference: Sherrah, J., 2016. Fully convolutional networks for dense semantic labelling of high-resolution aerial imagery. CoRR abs/1606.02585.
– reference: Chen, L., Papandreou, G., Kokkinos, I., Murphy, K., Yuille, A.L., 2016. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs. CoRR abs/1606.00915.
– volume: 12
  year: 2017
  ident: b0370
  article-title: National-scale cropland mapping based on spectral-temporal features and outdated land cover information
  publication-title: PloS One
– volume: 53
  start-page: 4483
  year: 2015
  end-page: 4495
  ident: b0190
  article-title: Robust rooftop extraction from visible band images using higher order crf
  publication-title: IEEE Trans. Geosci. Remote Sens.
– reference: Kervadec, H., Bouchtiba, J., Desrosiers, C., Granger, É., Dolz, J., Ayed, I.B., 2018. Boundary loss for highly unbalanced segmentation arXiv:
– volume: 47
  start-page: 884
  year: 2017
  end-page: 897
  ident: b0230
  article-title: Joint dictionary learning for multispectral change detection
  publication-title: IEEE Trans. Cybernetics
– reference: Bertasius, G., Shi, J., Torresani, L., 2015. Semantic segmentation with boundary neural fields. CoRR abs/1511.02674.
– volume: 87
  start-page: 180
  year: 2014
  end-page: 191
  ident: b0040
  article-title: Geographic object-based image analysis – towards a new paradigm
  publication-title: ISPRS J. Photogramm. Remote Sens.
– reference: Vadivel, A., Sural, Shamik, Majumdar, A.K., 2005. Human color perception in the hsv space and its application in histogram generation for image retrieval. doi:
– reference: Goyal, P., Dollár, P., Girshick, R.B., Noordhuis, P., Wesolowski, L., Kyrola, A., Tulloch, A., Jia, Y., He, K., 2017. Accurate, large minibatch SGD: training imagenet in 1 hour. CoRR abs/1706.02677.
– reference: He, K., Zhang, X., Ren, S., Sun, J., 2016. Identity mappings in deep residual networks. CoRR abs/1603.05027.
– reference: He, K., Zhang, X., Ren, S., Sun, J., 2015. Deep residual learning for image recognition. CoRR abs/1512.03385.
– start-page: 18
  year: 2018
  ident: b0295
  article-title: High-resolution aerial imagery semantic labeling with dense pyramid network
  publication-title: Sensors
– volume: 10
  start-page: 1413
  year: 2017
  end-page: 1424
  ident: b0375
  article-title: Semantic classification of urban trees using very high resolution satellite imagery
  publication-title: IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens.
– volume: 152
  start-page: 166
  year: 2019
  end-page: 177
  ident: b0235
  article-title: Deep learning in remote sensing applications: a meta-analysis and review
  publication-title: ISPRS J. Photogramm. Remote Sens.
– volume: 10
  year: 2018
  ident: b0290
  article-title: Semantic labeling of high resolution aerial imagery and lidar data with fine segmentation network
  publication-title: Remote Sens.
– volume: 55
  start-page: 881
  year: 2017
  end-page: 893
  ident: b0365
  article-title: Dense semantic labeling of subdecimeter resolution images with convolutional neural networks
  publication-title: IEEE Trans. Geosci. Remote Sens.
– reference: Baatz, M., Schäpe, A., 2000. Multiresolution segmentation: an optimization approach for high quality multi-scale image segmentation (eCognition), pp. 12–23.
– year: 2016
  ident: b0275
  article-title: Deconvolution and checkerboard artifacts
  publication-title: Distill
– reference: Badrinarayanan, V., Kendall, A., Cipolla, R., 2015. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. CoRR abs/1511.00561.
– volume: 1
  start-page: 541
  year: 1989
  end-page: 551
  ident: b0185
  article-title: Backpropagation applied to handwritten zip code recognition
  publication-title: Neural Comput.
– volume: 3
  start-page: 1777
  year: 2011
  end-page: 1804
  ident: b0250
  article-title: Segment-based land cover mapping of a suburban area – comparison of high-resolution remotely sensed datasets using classification trees and test field points
  publication-title: Remote Sens.
– reference: Zhang, Z., Liu, Q., Wang, Y., 2017. Road extraction by deep residual u-net. CoRR abs/1711.10684.
– reference: Sergeev, A., Balso, M.D., 2018. Horovod: fast and easy distributed deep learning in TensorFlow. arXiv preprint arXiv:
– volume: 205
  start-page: 253
  year: 2018
  end-page: 275
  ident: b0100
  article-title: Using landsat and nighttime lights for supervised pixel-based image classification of urban land cover
  publication-title: Remote Sens. Environ.
– reference: Chen, L., Papandreou, G., Schroff, F., Adam, H., 2017. Rethinking atrous convolution for semantic image segmentation. CoRR abs/1706.05587.
– reference: Novikov, A.A., Major, D., Lenis, D., Hladuvka, J., Wimmer, M., Bühler, K., 2017. Fully convolutional architectures for multi-class segmentation in chest radiographs. CoRR abs/1701.08816.
– volume: 9
  start-page: 2868
  year: 2016
  end-page: 2881
  ident: b0280
  article-title: Semantic labeling of aerial and satellite imagery
  publication-title: IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens.
– start-page: 10
  year: 2018
  ident: b0305
  article-title: Supervised classification of multisensor remotely sensed images using a deep learning framework
  publication-title: Remote Sens.
– volume: 145
  start-page: 78
  year: 2018
  end-page: 95
  ident: b0210
  article-title: Semantic labeling in very high resolution images via a self-cascaded convolutional neural network
  publication-title: ISPRS J. Photogramm. Remote Sens.
– reference: He, K., Gkioxari, G., Dollár, P., Girshick, R.B., 2017. Mask R-CNN. CoRR abs/1703.06870.
– volume: 115
  start-page: 2320
  year: 2011
  end-page: 2329
  ident: b0405
  article-title: Mapping urbanization dynamics at regional and global scales using multi-temporal dmsp/ols nighttime light data
  publication-title: Remote Sens. Environ.
– reference: Ruder, S., 2017. An overview of multi-task learning in deep neural networks. CoRR abs/1706.05098.
– reference: Dice, L.R., 1945. Measures of the amount of ecologic association between species. Ecology 26, 297–302. doi:
– volume: 9
  year: 2017
  ident: b0015
  article-title: Segment-before-detect: vehicle detection and classification through semantic segmentation of aerial images
  publication-title: Remote Sens.
– reference: Lin, T., Goyal, P., Girshick, R.B., He, K., Dollár, P., 2017. Focal loss for dense object detection. CoRR abs/1708.02002.
– volume: 9
  year: 2019
  ident: b0115
  article-title: A survey on deep learning-driven remote sensing image scene understanding: Scene classification, scene retrieval and scene-guided object detection
  publication-title: Appl. Sci.
– reference: Xie, S.M., Jean, N., Burke, M., Lobell, D.B., Ermon, S., 2015. Transfer learning from deep features for remote sensing and poverty mapping. CoRR abs/1510.00098.
– reference: Penatti, O.A., Nogueira, K., dos Santos, J.A., 2015. Do deep features generalize from everyday objects to remote sensing and aerial scenes domains? In: 2015 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 44–51.
– volume: 9
  year: 2017
  ident: b0215
  article-title: Hourglass-shape network based semantic segmentation for high resolution aerial imagery
  publication-title: Remote Sens.
– reference: Abraham, N., Khan, N.M., 2018. A novel focal tversky loss function with improved attention u-net for lesion segmentation. CoRR abs/1810.07842.
– volume: 25
  start-page: 1451
  year: 2006
  end-page: 1461
  ident: b0075
  article-title: Generalized overlap measures for evaluation and validation in medical image analysis
  publication-title: IEEE Trans. Med. Imaging
– reference: Long, J., Shelhamer, E., Darrell, T., 2014. Fully convolutional networks for semantic segmentation. CoRR abs/1411.4038.
– volume: 8
  start-page: 232
  year: 2016
  ident: b0175
  article-title: Cropland mapping over sahelian and sudanian agrosystems: a knowledge-based approach using proba-v time series at 100-m
  publication-title: Remote Sens.
– reference: Deng, J., Dong, W., Socher, R., Li, L.J., Li, K., Fei-Fei, L., 2009. ImageNet: A Large-Scale Hierarchical Image Database. In: CVPR09.
– reference: Liu, Y., Piramanayagam, S., Monteiro, S.T., Saber, E., 2017b. Dense semantic labeling of very-high-resolution aerial imagery and lidar with fully-convolutional neural networks and higher-order crfs. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Honolulu, USA.
– volume: 24
  start-page: 603
  year: 2002
  end-page: 619
  ident: b0070
  article-title: Mean shift: a robust approach toward feature space analysis
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
– reference: Jaderberg, M., Simonyan, K., Zisserman, A., Kavukcuoglu, K., 2015. Spatial transformer networks. CoRR abs/1506.02025.
– reference: Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A., Bengio, Y., 2014. Generative adversarial nets. In: Ghahramani, Z., Welling, M., Cortes, C., Lawrence, N.D., Weinberger, K.Q. (Eds.), Advances in Neural Information Processing Systems, vol. 27. Curran Associates, Inc., pp. 2672–2680.
– reference: ISPRS, International Society for Photogrammetry and Remote Sensing (ISPRS) and BSF Swissphoto: WG3 Potsdam overhead data.
– reference: Milletari, F., Navab, N., Ahmadi, S., 2016. V-net: Fully convolutional neural networks for volumetric medical image segmentation. CoRR abs/1606.04797.
– reference: Audebert, N., Saux, B.L., Lefèvre, S., 2016. Semantic segmentation of earth observation data using multimodal and multi-scale deep networks. CoRR abs/1609.06846.
– volume: 22
  start-page: 1345
  year: 2010
  end-page: 1359
  ident: b0285
  article-title: A survey on transfer learning
  publication-title: IEEE Trans. Knowl. Data Eng.
– reference: Li, S., Jiao, J., Han, Y., Weissman, T., 2016. Demystifying resnet. CoRR abs/1611.01186.
– reference: Sudre, C.H., Li, W., Vercauteren, T., Ourselin, S., Cardoso, M.J., 2017. Generalised dice overlap as a deep learning loss function for highly unbalanced segmentations. CoRR abs/1707.03237.
– reference: Zhao, H., Shi, J., Qi, X., Wang, X., Jia, J., 2017a. Pyramid scene parsing network. In: CVPR.
– volume: 10
  year: 2018
  ident: b0390
  article-title: Building extraction in very high resolution imagery by dense-attention networks
  publication-title: Remote Sens.
– volume: 8
  start-page: 329
  year: 2016
  ident: b0180
  article-title: Classification and segmentation of satellite orthoimagery using convolutional neural networks
  publication-title: Remote Sens.
– volume: 5
  start-page: 1
  year: 1948
  end-page: 34
  ident: b0340
  article-title: A method of establishing groups of equal amplitude in plant sociology based on similarity of species and its application to analyses of the vegetation on Danish commons
  publication-title: Biol. Skr.
– volume: 5
  start-page: 8
  year: 2017
  end-page: 36
  ident: b0430
  article-title: Deep learning in remote sensing: a comprehensive review and list of resources
  publication-title: IEEE Geosci. Remote Sens. Mag.
– reference: Zhang, H., Dana, K., Shi, J., Zhang, Z., Wang, X., Tyagi, A., Agrawal, A., 2018. Context encoding for semantic segmentation. In: The IEEE Conference on Computer Vision and Pattern Recognition (CVPR).
– reference: Marmanis, D., Wegner, J.D., Galliani, S., Schindler, K., Datcu, M., Stilla, U., 2016. Semantic segmentation of aerial images with an ensemble of cnns.
– reference: Ronneberger, O., Fischer, P., Brox, T., 2015. U-net: Convolutional networks for biomedical image segmentation. CoRR abs/1505.04597.
– reference: Zhu, J., Park, T., Isola, P., Efros, A.A., 2017. Unpaired image-to-image translation using cycle-consistent adversarial networks. CoRR abs/1703.10593.
– reference: Smith, L.N., 2018. A disciplined approach to neural network hyper-parameters: Part 1 – learning rate, batch size, momentum, and weight decay. CoRR abs/1803.09820.
– volume: 115
  start-page: 1145
  year: 2011
  end-page: 1161
  ident: b0265
  article-title: Per-pixel vs. object-based classification of urban land cover extraction using high spatial resolution imagery
  publication-title: Remote Sens. Environ.
– reference: Zagoruyko, S., Komodakis, N., 2016. Wide residual networks. CoRR abs/1605.07146. http://arxiv.org/abs/1605.07146, arXiv:1605.07146.
– volume: 132
  start-page: 48
  year: 2017
  end-page: 60
  ident: b0420
  article-title: Contextually guided very-high-resolution imagery classification with semantic segments
  publication-title: ISPRS J. Photogramm. Remote Sens.
– volume: 140
  start-page: 20
  year: 2018
  end-page: 32
  ident: b0010
  article-title: Beyond rgb: Very high resolution urban remote sensing with multimodal deep networks
  publication-title: ISPRS J. Photogramm. Remote Sens.
– reference: Drozdzal, M., Vorontsov, E., Chartrand, G., Kadoury, S., Pal, C., 2016. The importance of skip connections in biomedical image segmentation. CoRR abs/1608.04117.
– volume: 34
  start-page: 344
  year: 1986
  end-page: 371
  ident: b0045
  article-title: Distance transformations in digital images
  publication-title: Comput. Vision Graph. Image Process.
– reference: He, K., Zhang, X., Ren, S., Sun, J., 2014. Spatial pyramid pooling in deep convolutional networks for visual recognition. CoRR abs/1406.4729.
– volume: 6
  start-page: 11372
  year: 2014
  end-page: 11390
  ident: b0200
  article-title: Object-based land-cover mapping with high resolution aerial photography at a county scale in midwestern usa
  publication-title: Remote Sens.
– volume: 405
  start-page: 442
  year: 1975
  end-page: 451
  ident: b0255
  article-title: Comparison of the predicted and observed secondary structure of t4 phage lysozyme
  publication-title: Biochimica et Biophysica Acta (BBA) – Protein Structure
– reference: Ioffe, S., Szegedy, C., 2015. Batch normalization: Accelerating deep network training by reducing internal covariate shift. CoRR abs/1502.03167.
– reference: Kingma, D.P., Ba, J., 2014. Adam: A method for stochastic optimization. CoRR abs/1412.6980.
– volume: 205
  start-page: 253
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0100
  article-title: Using landsat and nighttime lights for supervised pixel-based image classification of urban land cover
  publication-title: Remote Sens. Environ.
  doi: 10.1016/j.rse.2017.11.026
– ident: 10.1016/j.isprsjprs.2020.01.013_b0325
– ident: 10.1016/j.isprsjprs.2020.01.013_b0350
– volume: 9
  year: 2019
  ident: 10.1016/j.isprsjprs.2020.01.013_b0115
  article-title: A survey on deep learning-driven remote sensing image scene understanding: Scene classification, scene retrieval and scene-guided object detection
  publication-title: Appl. Sci.
  doi: 10.3390/app9102110
– ident: 10.1016/j.isprsjprs.2020.01.013_b0270
– volume: 405
  start-page: 442
  year: 1975
  ident: 10.1016/j.isprsjprs.2020.01.013_b0255
  article-title: Comparison of the predicted and observed secondary structure of t4 phage lysozyme
  publication-title: Biochimica et Biophysica Acta (BBA) – Protein Structure
  doi: 10.1016/0005-2795(75)90109-9
– volume: 145
  start-page: 78
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0210
  article-title: Semantic labeling in very high resolution images via a self-cascaded convolutional neural network
  publication-title: ISPRS J. Photogramm. Remote Sens.
  doi: 10.1016/j.isprsjprs.2017.12.007
– ident: 10.1016/j.isprsjprs.2020.01.013_b0260
– ident: 10.1016/j.isprsjprs.2020.01.013_b0155
– volume: 8
  start-page: 329
  year: 2016
  ident: 10.1016/j.isprsjprs.2020.01.013_b0180
  article-title: Classification and segmentation of satellite orthoimagery using convolutional neural networks
  publication-title: Remote Sens.
  doi: 10.3390/rs8040329
– volume: 115
  start-page: 2320
  year: 2011
  ident: 10.1016/j.isprsjprs.2020.01.013_b0405
  article-title: Mapping urbanization dynamics at regional and global scales using multi-temporal dmsp/ols nighttime light data
  publication-title: Remote Sens. Environ.
  doi: 10.1016/j.rse.2011.04.032
– volume: 47
  start-page: 884
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0230
  article-title: Joint dictionary learning for multispectral change detection
  publication-title: IEEE Trans. Cybernetics
  doi: 10.1109/TCYB.2016.2531179
– ident: 10.1016/j.isprsjprs.2020.01.013_b0315
– ident: 10.1016/j.isprsjprs.2020.01.013_b0225
– ident: 10.1016/j.isprsjprs.2020.01.013_b0380
  doi: 10.1109/ICCV.2015.164
– ident: 10.1016/j.isprsjprs.2020.01.013_b0415
  doi: 10.1109/CVPR.2017.660
– volume: 34
  start-page: 344
  year: 1986
  ident: 10.1016/j.isprsjprs.2020.01.013_b0045
  article-title: Distance transformations in digital images
  publication-title: Comput. Vision Graph. Image Process.
  doi: 10.1016/S0734-189X(86)80047-0
– ident: 10.1016/j.isprsjprs.2020.01.013_b0055
– ident: 10.1016/j.isprsjprs.2020.01.013_b0330
– ident: 10.1016/j.isprsjprs.2020.01.013_b0300
  doi: 10.1109/CVPRW.2015.7301382
– ident: 10.1016/j.isprsjprs.2020.01.013_b0145
– ident: 10.1016/j.isprsjprs.2020.01.013_b0110
– ident: 10.1016/j.isprsjprs.2020.01.013_b0135
– ident: 10.1016/j.isprsjprs.2020.01.013_b0160
– volume: 12
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0370
  article-title: National-scale cropland mapping based on spectral-temporal features and outdated land cover information
  publication-title: PloS One
  doi: 10.1371/journal.pone.0181911
– ident: 10.1016/j.isprsjprs.2020.01.013_b0410
– volume: 132
  start-page: 48
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0420
  article-title: Contextually guided very-high-resolution imagery classification with semantic segments
  publication-title: ISPRS J. Photogramm. Remote Sens.
  doi: 10.1016/j.isprsjprs.2017.08.011
– ident: 10.1016/j.isprsjprs.2020.01.013_b0400
  doi: 10.1109/CVPR.2018.00747
– volume: 29
  start-page: 2352
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0310
  article-title: Deep convolutional neural networks for image classification: a comprehensive review
  publication-title: Neural Comput.
  doi: 10.1162/neco_a_00990
– volume: 88
  start-page: 303
  year: 2010
  ident: 10.1016/j.isprsjprs.2020.01.013_b0095
  article-title: The pascal visual object classes (voc) challenge
  publication-title: Int. J. Comput. Vision
  doi: 10.1007/s11263-009-0275-4
– volume: 55
  start-page: 881
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0365
  article-title: Dense semantic labeling of subdecimeter resolution images with convolutional neural networks
  publication-title: IEEE Trans. Geosci. Remote Sens.
  doi: 10.1109/TGRS.2016.2616585
– ident: 10.1016/j.isprsjprs.2020.01.013_b0395
– ident: 10.1016/j.isprsjprs.2020.01.013_b0005
– volume: 10
  start-page: 1413
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0375
  article-title: Semantic classification of urban trees using very high resolution satellite imagery
  publication-title: IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens.
  doi: 10.1109/JSTARS.2016.2645798
– ident: 10.1016/j.isprsjprs.2020.01.013_b0170
– volume: 152
  start-page: 166
  year: 2019
  ident: 10.1016/j.isprsjprs.2020.01.013_b0235
  article-title: Deep learning in remote sensing applications: a meta-analysis and review
  publication-title: ISPRS J. Photogramm. Remote Sens.
  doi: 10.1016/j.isprsjprs.2019.04.015
– ident: 10.1016/j.isprsjprs.2020.01.013_b0085
  doi: 10.2307/1932409
– volume: 55
  start-page: 3322
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0065
  article-title: Automatic road detection and centerline extraction via cascaded end-to-end convolutional neural network
  publication-title: IEEE Trans. Geosci. Remote Sens.
  doi: 10.1109/TGRS.2017.2669341
– ident: 10.1016/j.isprsjprs.2020.01.013_b0385
– volume: 10
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0290
  article-title: Semantic labeling of high resolution aerial imagery and lidar data with fine segmentation network
  publication-title: Remote Sens.
  doi: 10.3390/rs10050743
– ident: 10.1016/j.isprsjprs.2020.01.013_b0425
  doi: 10.1109/ICCV.2017.244
– volume: 22
  start-page: 1345
  year: 2010
  ident: 10.1016/j.isprsjprs.2020.01.013_b0285
  article-title: A survey on transfer learning
  publication-title: IEEE Trans. Knowl. Data Eng.
  doi: 10.1109/TKDE.2009.191
– ident: 10.1016/j.isprsjprs.2020.01.013_b0080
  doi: 10.1109/CVPR.2009.5206848
– ident: 10.1016/j.isprsjprs.2020.01.013_b0050
– ident: 10.1016/j.isprsjprs.2020.01.013_b0090
  doi: 10.1007/978-3-319-46976-8_19
– volume: 135
  start-page: 158
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0240
  article-title: Classification with an edge: Improving semantic image segmentation with boundary detection
  publication-title: ISPRS J. Photogramm. Remote Sens.
  doi: 10.1016/j.isprsjprs.2017.11.009
– volume: 87
  start-page: 180
  year: 2014
  ident: 10.1016/j.isprsjprs.2020.01.013_b0040
  article-title: Geographic object-based image analysis – towards a new paradigm
  publication-title: ISPRS J. Photogramm. Remote Sens.
  doi: 10.1016/j.isprsjprs.2013.09.014
– volume: 6
  start-page: 11372
  year: 2014
  ident: 10.1016/j.isprsjprs.2020.01.013_b0200
  article-title: Object-based land-cover mapping with high resolution aerial photography at a county scale in midwestern usa
  publication-title: Remote Sens.
  doi: 10.3390/rs61111372
– year: 2016
  ident: 10.1016/j.isprsjprs.2020.01.013_b0275
  article-title: Deconvolution and checkerboard artifacts
  publication-title: Distill
  doi: 10.23915/distill.00003
– ident: 10.1016/j.isprsjprs.2020.01.013_b0355
  doi: 10.1117/12.586823
– ident: 10.1016/j.isprsjprs.2020.01.013_b0345
  doi: 10.1007/978-3-319-67558-9_28
– ident: 10.1016/j.isprsjprs.2020.01.013_b0130
– ident: 10.1016/j.isprsjprs.2020.01.013_b0245
  doi: 10.1109/IGARSS.2017.8128165
– volume: 115
  start-page: 1145
  year: 2011
  ident: 10.1016/j.isprsjprs.2020.01.013_b0265
  article-title: Per-pixel vs. object-based classification of urban land cover extraction using high spatial resolution imagery
  publication-title: Remote Sens. Environ.
  doi: 10.1016/j.rse.2010.12.017
– ident: 10.1016/j.isprsjprs.2020.01.013_b0205
  doi: 10.1109/ICCV.2017.324
– ident: 10.1016/j.isprsjprs.2020.01.013_b0060
– ident: 10.1016/j.isprsjprs.2020.01.013_b0025
– ident: 10.1016/j.isprsjprs.2020.01.013_b0195
– volume: 140
  start-page: 20
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0010
  article-title: Beyond rgb: Very high resolution urban remote sensing with multimodal deep networks
  publication-title: ISPRS J. Photogramm. Remote Sens.
  doi: 10.1016/j.isprsjprs.2017.11.011
– ident: 10.1016/j.isprsjprs.2020.01.013_b0120
– volume: 5
  start-page: 1
  year: 1948
  ident: 10.1016/j.isprsjprs.2020.01.013_b0340
  article-title: A method of establishing groups of equal amplitude in plant sociology based on similarity of species and its application to analyses of the vegetation on Danish commons
  publication-title: Biol. Skr.
– ident: 10.1016/j.isprsjprs.2020.01.013_b0105
– ident: 10.1016/j.isprsjprs.2020.01.013_b0030
– start-page: 18
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0295
  article-title: High-resolution aerial imagery semantic labeling with dense pyramid network
  publication-title: Sensors
– ident: 10.1016/j.isprsjprs.2020.01.013_b0320
– ident: 10.1016/j.isprsjprs.2020.01.013_b0020
– volume: 9
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0215
  article-title: Hourglass-shape network based semantic segmentation for high resolution aerial imagery
  publication-title: Remote Sens.
– ident: 10.1016/j.isprsjprs.2020.01.013_b0220
  doi: 10.1109/CVPRW.2017.200
– volume: 9
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0015
  article-title: Segment-before-detect: vehicle detection and classification through semantic segmentation of aerial images
  publication-title: Remote Sens.
  doi: 10.3390/rs9040368
– volume: 25
  start-page: 1451
  year: 2006
  ident: 10.1016/j.isprsjprs.2020.01.013_b0075
  article-title: Generalized overlap measures for evaluation and validation in medical image analysis
  publication-title: IEEE Trans. Med. Imaging
  doi: 10.1109/TMI.2006.880587
– ident: 10.1016/j.isprsjprs.2020.01.013_b0150
– volume: 10
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0390
  article-title: Building extraction in very high resolution imagery by dense-attention networks
  publication-title: Remote Sens.
  doi: 10.3390/rs10111768
– volume: 8
  start-page: 232
  year: 2016
  ident: 10.1016/j.isprsjprs.2020.01.013_b0175
  article-title: Cropland mapping over sahelian and sudanian agrosystems: a knowledge-based approach using proba-v time series at 100-m
  publication-title: Remote Sens.
  doi: 10.3390/rs8030232
– volume: 3
  start-page: 1777
  year: 2011
  ident: 10.1016/j.isprsjprs.2020.01.013_b0250
  article-title: Segment-based land cover mapping of a suburban area – comparison of high-resolution remotely sensed datasets using classification trees and test field points
  publication-title: Remote Sens.
  doi: 10.3390/rs3081777
– start-page: 10
  year: 2018
  ident: 10.1016/j.isprsjprs.2020.01.013_b0305
  article-title: Supervised classification of multisensor remotely sensed images using a deep learning framework
  publication-title: Remote Sens.
– ident: 10.1016/j.isprsjprs.2020.01.013_b0335
– volume: 1
  start-page: 541
  year: 1989
  ident: 10.1016/j.isprsjprs.2020.01.013_b0185
  article-title: Backpropagation applied to handwritten zip code recognition
  publication-title: Neural Comput.
  doi: 10.1162/neco.1989.1.4.541
– start-page: 583
  year: 1991
  ident: 10.1016/j.isprsjprs.2020.01.013_b0360
  article-title: Watersheds in digital spaces: an efficient algorithm based on immersion simulations
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
  doi: 10.1109/34.87344
– volume: 5
  start-page: 8
  year: 2017
  ident: 10.1016/j.isprsjprs.2020.01.013_b0430
  article-title: Deep learning in remote sensing: a comprehensive review and list of resources
  publication-title: IEEE Geosci. Remote Sens. Mag.
  doi: 10.1109/MGRS.2017.2762307
– volume: 24
  start-page: 603
  year: 2002
  ident: 10.1016/j.isprsjprs.2020.01.013_b0070
  article-title: Mean shift: a robust approach toward feature space analysis
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
  doi: 10.1109/34.1000236
– ident: 10.1016/j.isprsjprs.2020.01.013_b0165
– volume: 53
  start-page: 4483
  year: 2015
  ident: 10.1016/j.isprsjprs.2020.01.013_b0190
  article-title: Robust rooftop extraction from visible band images using higher order crf
  publication-title: IEEE Trans. Geosci. Remote Sens.
  doi: 10.1109/TGRS.2015.2400462
– ident: 10.1016/j.isprsjprs.2020.01.013_b0140
– ident: 10.1016/j.isprsjprs.2020.01.013_b0035
– ident: 10.1016/j.isprsjprs.2020.01.013_b0125
  doi: 10.1109/ICCV.2017.322
– volume: 9
  start-page: 2868
  year: 2016
  ident: 10.1016/j.isprsjprs.2020.01.013_b0280
  article-title: Semantic labeling of aerial and satellite imagery
  publication-title: IEEE J. Sel. Top. Appl. Earth Observ. Remote Sens.
  doi: 10.1109/JSTARS.2016.2582921
SSID ssj0001568
Snippet Scene understanding of high resolution aerial images is of great importance for the task of automated monitoring in various remote sensing applications. Due to...
SourceID proquest
crossref
elsevier
SourceType Aggregation Database
Enrichment Source
Index Database
Publisher
StartPage 94
SubjectTerms aerial photography
Architecture
automation
color
Convolutional neural network
Data augmentation
data collection
flavor
Loss function
monitoring
neural networks
remote sensing
variance
Very high spatial resolution
Title ResUNet-a: A deep learning framework for semantic segmentation of remotely sensed data
URI https://dx.doi.org/10.1016/j.isprsjprs.2020.01.013
https://www.proquest.com/docview/2400520889
Volume 162
linkProvider Elsevier