Semantic Segmentation of Urban Buildings from VHR Remote Sensing Imagery Using a Deep Convolutional Neural Network
Published in | Remote Sensing (Basel, Switzerland), Vol. 11, No. 15, p. 1774
Format | Journal Article
Language | English
Published | Basel: MDPI AG, 28 July 2019
Abstract | Urban building segmentation is a prevalent research topic in very high resolution (VHR) remote sensing; however, the varied appearance of buildings and the complicated backgrounds in VHR imagery make accurate semantic segmentation of urban buildings challenging in practical applications. Following the basic architecture of U-Net, an end-to-end deep convolutional neural network (denoted DeepResUnet) was proposed that effectively performs pixel-scale urban building segmentation from VHR imagery and generates accurate segmentation results. The method contains two sub-networks: a cascaded down-sampling network that extracts building feature maps from the VHR image, and an up-sampling network that reconstructs the extracted feature maps back to the same size as the input image. Deep residual learning was adopted to facilitate training and to alleviate the degradation problem that often occurs during model training. DeepResUnet was tested on aerial images with a spatial resolution of 0.075 m and compared, under identical conditions, with six other state-of-the-art networks: FCN-8s, SegNet, DeconvNet, U-Net, ResUNet and DeepUNet. Extensive experiments indicated that DeepResUnet outperformed the other six networks in semantic segmentation of urban buildings in both visual and quantitative evaluation, labeling irregularly shaped and small buildings with notably higher accuracy and completeness. Compared with U-Net, the F1 score, Kappa coefficient and overall accuracy of DeepResUnet improved by 3.52%, 4.67% and 1.72%, respectively. Moreover, DeepResUnet requires far fewer parameters than U-Net, a significant improvement among U-Net variants. Nevertheless, the inference time of DeepResUnet is slightly longer than that of U-Net, which is subject to further improvement. |
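The degradation problem mentioned in the abstract is the empirical observation that very deep plain networks can train worse than shallower ones; residual learning counters it by reformulating each block as y = x + F(x), so the block only has to learn the residual F. The following is a minimal NumPy sketch of that idea (a hypothetical toy block, not the paper's DeepResUnet implementation):

```python
import numpy as np

def conv3x3(x, w):
    """Naive 'same'-padded 3x3 convolution over a single-channel 2-D feature map."""
    padded = np.pad(x, 1)
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * w)
    return out

def residual_block(x, w1, w2):
    """y = x + F(x): the identity shortcut means the block learns only the
    residual F, which is what alleviates the degradation problem."""
    h = np.maximum(conv3x3(x, w1), 0.0)   # conv + ReLU
    return x + conv3x3(h, w2)             # identity shortcut

# With all-zero weights the block is exactly the identity mapping, so adding
# more residual blocks can never make the representable function set worse.
x = np.arange(16, dtype=float).reshape(4, 4)
w0 = np.zeros((3, 3))
assert np.allclose(residual_block(x, w0, w0), x)
```

In a plain (non-residual) block, the same zero-weight setting would map everything to zero, which is one intuition for why deeper plain stacks are harder to optimize.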
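The abstract reports improvements in F1 score, Kappa coefficient and overall accuracy over U-Net. As a reference for how these three metrics relate for a binary (building vs. background) labelling, here is a minimal sketch computing them from a confusion matrix; the toy labels are illustrative and are not the paper's data:

```python
def segmentation_metrics(pred, truth):
    """Overall accuracy, F1 score and Cohen's Kappa for binary pixel labels."""
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))
    tn = sum(p == 0 and t == 0 for p, t in zip(pred, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, truth))
    n = tp + tn + fp + fn
    oa = (tp + tn) / n                      # overall accuracy
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    # Expected chance agreement, used by Cohen's Kappa
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (oa - pe) / (1 - pe)
    return oa, f1, kappa

truth = [1, 1, 0, 0, 1, 0, 1, 0]
pred  = [1, 0, 0, 0, 1, 1, 1, 0]
oa, f1, kappa = segmentation_metrics(pred, truth)
assert (oa, f1, kappa) == (0.75, 0.75, 0.5)
```

Note that Kappa discounts chance agreement, so it can drop sharply on class-imbalanced scenes (few building pixels) even when overall accuracy stays high, which is why papers typically report all three.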
Author | Yi, Yaning (ORCID 0000-0002-2653-8920); Zhang, Zhijie (0000-0002-7276-5649); Zhang, Wanchang (0000-0002-2607-4628); Zhang, Chuanrong (0000-0002-9165-5584); Li, Weidong (0000-0002-4558-3292); Zhao, Tian
ContentType | Journal Article |
Copyright | 2019. This work is licensed under https://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
DOI | 10.3390/rs11151774 |
Discipline | Geography |
EISSN | 2072-4292 |
GeographicLocations | China; Beijing, China; United States
ISSN | 2072-4292 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 15 |
Language | English |
License | https://creativecommons.org/licenses/by/4.0 |
OpenAccessLink | https://doaj.org/article/4d756467fa0d44bb8c35fdf6ecbca338 |
PublicationDate | 2019-07-28
PublicationPlace | Basel |
PublicationTitle | Remote sensing (Basel, Switzerland) |
PublicationYear | 2019 |
Publisher | MDPI AG |
SourceID | doaj proquest crossref |
SourceType | Open Website; Aggregation Database; Enrichment Source; Index Database |
StartPage | 1774 |
SubjectTerms | aerial photography; artificial intelligence; Artificial neural networks; Buildings; Classification; deep convolutional neural network; Deep learning; Detection; Feature extraction; Feature maps; Image processing; Image reconstruction; Image segmentation; Neural networks; Pattern recognition; quantitative analysis; Remote sensing; Sampling; Semantic segmentation; Semantics; Spatial discrimination; Spatial resolution; Training; U-Net; urban building extraction; VHR remote sensing imagery |
Title | Semantic Segmentation of Urban Buildings from VHR Remote Sensing Imagery Using a Deep Convolutional Neural Network |
URI | https://www.proquest.com/docview/2304049022 https://www.proquest.com/docview/2986256754 https://doaj.org/article/4d756467fa0d44bb8c35fdf6ecbca338 |
Volume | 11 |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Semantic+Segmentation+of+Urban+Buildings+from+VHR+Remote+Sensing+Imagery+Using+a+Deep+Convolutional+Neural+Network&rft.jtitle=Remote+sensing+%28Basel%2C+Switzerland%29&rft.au=Yaning+Yi&rft.au=Zhijie+Zhang&rft.au=Wanchang+Zhang&rft.au=Chuanrong+Zhang&rft.date=2019-07-28&rft.pub=MDPI+AG&rft.eissn=2072-4292&rft.volume=11&rft.issue=15&rft.spage=1774&rft_id=info:doi/10.3390%2Frs11151774&rft.externalDBID=DOA&rft.externalDocID=oai_doaj_org_article_4d756467fa0d44bb8c35fdf6ecbca338 |