CGNet: A Light-weight Context Guided Network for Semantic Segmentation
Published in | IEEE Transactions on Image Processing, Vol. 30, p. 1 |
Main Authors | Wu, Tianyi; Tang, Sheng; Zhang, Rui; Cao, Juan; Zhang, Yongdong |
Format | Journal Article |
Language | English |
Published | United States: IEEE, 01.01.2021; The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Subjects | Semantic Segmentation; Context Guided; Surrounding Context; Global Context; Image segmentation; Context modeling |
Online Access | Get full text |
Abstract | The demand for applying semantic segmentation models on mobile devices has been increasing rapidly. Current state-of-the-art networks have an enormous number of parameters and are hence unsuitable for mobile devices, while other models with a small memory footprint follow the spirit of classification networks and ignore the inherent characteristics of semantic segmentation. To tackle this problem, we propose a novel Context Guided Network (CGNet), a light-weight and efficient network for semantic segmentation. We first propose the Context Guided (CG) block, which learns the joint feature of both the local feature and the surrounding context effectively and efficiently, and further improves the joint feature with the global context. Based on the CG block, we develop CGNet, which captures contextual information in all stages of the network. CGNet is specially tailored to exploit the inherent properties of semantic segmentation and to increase segmentation accuracy. Moreover, CGNet is elaborately designed to reduce the number of parameters and save memory footprint. Under an equivalent number of parameters, the proposed CGNet significantly outperforms existing light-weight segmentation networks. Extensive experiments on the Cityscapes and CamVid datasets verify the effectiveness of the proposed approach. Specifically, without any post-processing or multi-scale testing, CGNet achieves 64.8% mean IoU on Cityscapes with fewer than 0.5 M parameters. |
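To make the CG block described in the abstract concrete, below is a minimal PyTorch sketch of the stated idea: a local-feature branch (f_loc), a surrounding-context branch (f_sur, a dilated convolution), their concatenation as the joint feature, and a global-context branch (f_glo) that re-weights the joint feature channel-wise. The half/half channel split, the dilation rate, the reduction ratio, and the squeeze-and-excitation-style gating are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn


class ContextGuidedBlock(nn.Module):
    """Sketch of a CG block: a joint (local + surrounding) feature refined by
    global-context channel re-weighting. Hyper-parameters are assumptions."""

    def __init__(self, channels: int, dilation: int = 2, reduction: int = 16):
        super().__init__()
        half = channels // 2
        # f_loc: plain 3x3 convolution extracts the local feature.
        self.f_loc = nn.Conv2d(channels, half, 3, padding=1, bias=False)
        # f_sur: dilated 3x3 convolution covers the surrounding context.
        self.f_sur = nn.Conv2d(channels, half, 3, padding=dilation,
                               dilation=dilation, bias=False)
        self.bn_act = nn.Sequential(nn.BatchNorm2d(channels), nn.PReLU(channels))
        # f_glo: global average pooling + bottleneck produces channel weights
        # that inject global context into the joint feature.
        self.f_glo = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        loc = self.f_loc(x)                                # local feature
        sur = self.f_sur(x)                                # surrounding context
        joint = self.bn_act(torch.cat([loc, sur], dim=1))  # joint feature
        return joint * self.f_glo(joint)                   # global refinement


if __name__ == "__main__":
    block = ContextGuidedBlock(64)
    out = block(torch.randn(1, 64, 64, 128))
    print(out.shape)  # torch.Size([1, 64, 64, 128])
```

The block preserves spatial resolution and adds relatively few parameters; stacking blocks like this at every stage, rather than only in a decoder head, is how the abstract's "contextual information in all stages of the network" can be read.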
AbstractList | The demand for applying semantic segmentation models on mobile devices has been increasing rapidly. Current state-of-the-art networks have an enormous number of parameters and are hence unsuitable for mobile devices, while other models with a small memory footprint follow the spirit of classification networks and ignore the inherent characteristics of semantic segmentation. To tackle this problem, we propose a novel Context Guided Network (CGNet), a light-weight and efficient network for semantic segmentation. We first propose the Context Guided (CG) block, which learns the joint feature of both the local feature and the surrounding context effectively and efficiently, and further improves the joint feature with the global context. Based on the CG block, we develop CGNet, which captures contextual information in all stages of the network. CGNet is specially tailored to exploit the inherent properties of semantic segmentation and to increase segmentation accuracy. Moreover, CGNet is elaborately designed to reduce the number of parameters and save memory footprint. Under an equivalent number of parameters, the proposed CGNet significantly outperforms existing light-weight segmentation networks. Extensive experiments on the Cityscapes and CamVid datasets verify the effectiveness of the proposed approach. Specifically, without any post-processing or multi-scale testing, CGNet achieves 64.8% mean IoU on Cityscapes with fewer than 0.5 M parameters. |
Author | Zhang, Rui; Tang, Sheng; Wu, Tianyi; Cao, Juan; Zhang, Yongdong |
Author_xml | – sequence: 1 givenname: Tianyi surname: Wu fullname: Wu, Tianyi organization: Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China, 100190, and University of the Chinese Academy of Sciences, Beijing, China, 100049
– sequence: 2 givenname: Sheng surname: Tang fullname: Tang, Sheng organization: Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China, 100190, and University of the Chinese Academy of Sciences, Beijing, China, 100049. (e-mail: ts@ict.ac.cn)
– sequence: 3 givenname: Rui surname: Zhang fullname: Zhang, Rui organization: Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China, 100190, and University of the Chinese Academy of Sciences, Beijing, China, 100049
– sequence: 4 givenname: Juan surname: Cao fullname: Cao, Juan organization: Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China, 100190, and University of the Chinese Academy of Sciences, Beijing, China, 100049
– sequence: 5 givenname: Yongdong surname: Zhang fullname: Zhang, Yongdong organization: Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China, 100190, and University of the Chinese Academy of Sciences, Beijing, China, 100049 |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/33306466 (View this record in MEDLINE/PubMed) |
BookMark | eNp9kc1LAzEQxYMoWj_ugiALXrxszSTZTeNNitZCUUE9h-zurKZ2NzWbRf3vTWn14MHTzJDfe0zm7ZPt1rVIyDHQIQBVF0_ThyGjjA45FYzm2RYZgBKQ0jhux55mMpUg1B7Z77o5pSAyyHfJHuec5iLPB-RmPLnDcJlcJTP78hrSD1yVZOzagJ8hmfS2wiqJyIfzb0ntfPKIjWmDLWPz0mAbTLCuPSQ7tVl0eLSpB-T55vppfJvO7ifT8dUsLQXwkKIYZXWGcRGeSSkNw7qWTEihigqAQQEGC85LWWTK1KxSvFCl4aMKsEQVnw7I-dp36d17j13Qje1KXCxMi67vdPSiORuJTEb07A86d71v43YrChSIEeWROt1QfdFgpZfeNsZ_6Z8LRSBfA6V3Xeex1qVd_zl4YxcaqF5FoWMUehWF3kQRhfSP8Mf7H8nJWmIR8RdXTDEhFP8GfXuRJg |
CODEN | IIPRE4 |
CitedBy_id | crossref_primary_10_1109_TGRS_2022_3203910 crossref_primary_10_5194_gmd_17_3533_2024 crossref_primary_10_1016_j_jag_2023_103646 crossref_primary_10_1016_j_knosys_2023_110932 crossref_primary_10_1109_JBHI_2023_3312338 crossref_primary_10_1186_s43238_025_00175_2 crossref_primary_10_3390_app14104063 crossref_primary_10_1007_s40747_023_01054_y crossref_primary_10_1016_j_bspc_2022_104278 crossref_primary_10_1016_j_neucom_2024_127975 crossref_primary_10_1016_j_compbiomed_2023_107844 crossref_primary_10_1016_j_sigpro_2023_109150 crossref_primary_10_11834_jig_230670 crossref_primary_10_1007_s10489_021_02446_8 crossref_primary_10_1109_TITS_2024_3455416 crossref_primary_10_1109_TIP_2023_3301342 crossref_primary_10_1007_s44196_024_00654_x crossref_primary_10_1109_TSMC_2024_3377280 crossref_primary_10_1109_TIM_2022_3205669 crossref_primary_10_1109_ACCESS_2024_3448364 crossref_primary_10_1007_s11554_024_01502_z crossref_primary_10_1049_itr2_12204 crossref_primary_10_1007_s00521_022_08102_7 crossref_primary_10_11834_jig_211127 crossref_primary_10_1109_TAI_2024_3355354 crossref_primary_10_1109_JSTARS_2023_3314847 crossref_primary_10_1109_ACCESS_2024_3516814 crossref_primary_10_1109_TII_2021_3123233 crossref_primary_10_12677_AIRR_2024_131003 crossref_primary_10_1155_2023_1987988 crossref_primary_10_1109_TMM_2024_3372835 crossref_primary_10_1016_j_jag_2024_104111 crossref_primary_10_1016_j_bspc_2023_104896 crossref_primary_10_1016_j_engappai_2023_107316 crossref_primary_10_1080_01431161_2021_2014077 crossref_primary_10_1587_transinf_2024EDP7025 crossref_primary_10_1007_s11554_022_01236_w crossref_primary_10_3390_math12142185 crossref_primary_10_3788_LOP220914 crossref_primary_10_1109_TNNLS_2022_3176493 crossref_primary_10_1016_j_heliyon_2024_e34782 crossref_primary_10_3390_app122412786 crossref_primary_10_3390_agriculture12101543 crossref_primary_10_1016_j_measurement_2024_116193 crossref_primary_10_1049_ipr2_13058 crossref_primary_10_1109_TCYB_2022_3162873 crossref_primary_10_1109_TMM_2022_3157995 crossref_primary_10_1051_bioconf_202411103020 crossref_primary_10_1109_TIP_2024_3444190 crossref_primary_10_3390_s24217040 crossref_primary_10_1016_j_jag_2024_103950 crossref_primary_10_1109_TGRS_2024_3351437 crossref_primary_10_1109_TIP_2024_3423390 crossref_primary_10_1016_j_neunet_2023_10_046 crossref_primary_10_1016_j_engappai_2024_109016 crossref_primary_10_1109_ACCESS_2024_3400962 crossref_primary_10_1016_j_vlsi_2024_102150 crossref_primary_10_1109_ACCESS_2024_3524454 crossref_primary_10_3390_app15052823 crossref_primary_10_1016_j_media_2024_103395 crossref_primary_10_1038_s41598_024_58965_0 crossref_primary_10_1007_s11760_023_02539_6 crossref_primary_10_1016_j_engappai_2022_105510 crossref_primary_10_1109_ACCESS_2024_3439858 crossref_primary_10_1109_TITS_2023_3330498 crossref_primary_10_1016_j_engappai_2023_106324 crossref_primary_10_1016_j_patcog_2023_110215 crossref_primary_10_1109_JSTARS_2024_3374233 crossref_primary_10_1016_j_autcon_2022_104139 crossref_primary_10_3390_math11173644 crossref_primary_10_1016_j_engappai_2024_108903 crossref_primary_10_1016_j_cag_2022_10_002 crossref_primary_10_1016_j_cag_2023_12_015 crossref_primary_10_3390_app14167273 crossref_primary_10_1016_j_jag_2024_104093 crossref_primary_10_1016_j_patcog_2022_109228 crossref_primary_10_1007_s10489_022_03932_3 crossref_primary_10_3788_LOP231461 crossref_primary_10_1016_j_knosys_2024_112520 crossref_primary_10_1109_ACCESS_2021_3071866 crossref_primary_10_3390_ijgi13100347 crossref_primary_10_1007_s10278_024_01217_4 
crossref_primary_10_1109_TIP_2022_3166673 crossref_primary_10_1007_s00138_023_01500_4 crossref_primary_10_3389_fnbot_2022_1075520 crossref_primary_10_1016_j_imavis_2023_104823 crossref_primary_10_1109_TMM_2023_3290426 crossref_primary_10_3390_s25010103 crossref_primary_10_1016_j_neunet_2023_01_008 crossref_primary_10_1109_TITS_2022_3232897 crossref_primary_10_1007_s11263_024_02045_3 crossref_primary_10_1016_j_cviu_2024_104196 crossref_primary_10_1016_j_neucom_2022_12_036 crossref_primary_10_1109_TGRS_2021_3131152 crossref_primary_10_1109_TCBBIO_2024_3525409 crossref_primary_10_7717_peerj_cs_1746 crossref_primary_10_1016_j_compbiomed_2023_107898 crossref_primary_10_1029_2022JD037041 crossref_primary_10_1007_s11554_025_01661_7 crossref_primary_10_1016_j_engappai_2023_107673 crossref_primary_10_1109_ACCESS_2024_3503676 crossref_primary_10_3390_electronics13173361 crossref_primary_10_1016_j_compag_2024_109445 crossref_primary_10_1016_j_engappai_2022_105070 crossref_primary_10_3390_rs16173169 crossref_primary_10_1016_j_neucom_2023_03_006 crossref_primary_10_1088_1361_665X_abea1e crossref_primary_10_1109_TIP_2021_3122004 crossref_primary_10_1016_j_eswa_2023_120084 crossref_primary_10_1007_s11063_023_11145_z crossref_primary_10_1109_LRA_2022_3187278 crossref_primary_10_3390_s24165145 crossref_primary_10_1007_s00371_025_03853_5 crossref_primary_10_1007_s11042_023_15823_x crossref_primary_10_1016_j_imavis_2024_105408 crossref_primary_10_1109_JSEN_2024_3383233 crossref_primary_10_1007_s13042_023_02077_0 crossref_primary_10_1134_S1054661824700081 crossref_primary_10_1109_TITS_2021_3098355 crossref_primary_10_1016_j_dcan_2023_05_010 crossref_primary_10_1109_TITS_2023_3300537 crossref_primary_10_1109_TIM_2024_3427806 crossref_primary_10_1186_s12903_024_04193_x crossref_primary_10_1016_j_engappai_2023_107086 crossref_primary_10_1364_AO_449589 crossref_primary_10_1155_2022_3215083 crossref_primary_10_1117_1_JEI_32_4_043033 crossref_primary_10_1111_mice_13119 crossref_primary_10_1080_01431161_2023_2224101 crossref_primary_10_1117_1_JEI_33_1_013008 crossref_primary_10_1016_j_bspc_2024_107457 crossref_primary_10_1109_TGRS_2022_3152587 crossref_primary_10_3390_electronics13163226 crossref_primary_10_1016_j_cag_2023_07_039 crossref_primary_10_1016_j_cag_2024_104144 crossref_primary_10_1109_TCSVT_2022_3144184 crossref_primary_10_1016_j_neucom_2025_129489 crossref_primary_10_3389_fpls_2023_1320448 crossref_primary_10_3390_rs15112810 crossref_primary_10_3390_s24134302 crossref_primary_10_1007_s13369_024_09904_8 crossref_primary_10_1109_TUFFC_2022_3169684 crossref_primary_10_1016_j_compag_2022_107355 crossref_primary_10_1117_1_JEI_33_2_023042 crossref_primary_10_1109_TIV_2022_3176860 crossref_primary_10_1109_JSTARS_2022_3181303 crossref_primary_10_1007_s11042_022_12821_3 crossref_primary_10_1080_10589759_2024_2430384 crossref_primary_10_1109_TII_2024_3366221 crossref_primary_10_1109_JSEN_2023_3347584 crossref_primary_10_1016_j_asoc_2024_112065 crossref_primary_10_1016_j_engappai_2023_107736 crossref_primary_10_1016_j_compeleceng_2024_109996 crossref_primary_10_1109_TIM_2022_3212113 crossref_primary_10_1007_s10489_023_04688_0 crossref_primary_10_1007_s13735_024_00321_z crossref_primary_10_1016_j_autcon_2024_105816 crossref_primary_10_1088_1742_6596_2418_1_012081 crossref_primary_10_3390_rs14246256 crossref_primary_10_1016_j_knosys_2023_110541 crossref_primary_10_1109_TGRS_2024_3378970 crossref_primary_10_1007_s11042_024_18911_8 crossref_primary_10_53941_ijndi_2025_100006 crossref_primary_10_1007_s43684_025_00096_y 
crossref_primary_10_1007_s10044_024_01237_4 crossref_primary_10_1109_JSTARS_2024_3492533 crossref_primary_10_1016_j_conengprac_2023_105560 crossref_primary_10_1093_jas_skae098 crossref_primary_10_1109_TGRS_2024_3515157 crossref_primary_10_1088_1742_6596_2234_1_012012 crossref_primary_10_1002_rob_22406 crossref_primary_10_1080_01431161_2024_2411069 crossref_primary_10_1109_TIE_2023_3243265 crossref_primary_10_1109_TIP_2021_3109518 crossref_primary_10_1109_TIP_2023_3318967 crossref_primary_10_1007_s13344_024_0068_0 crossref_primary_10_1109_TIP_2023_3305090 crossref_primary_10_1109_ACCESS_2025_3546946 crossref_primary_10_3390_rs16091478 crossref_primary_10_1016_j_autcon_2025_106068 crossref_primary_10_1109_TIP_2023_3298475 crossref_primary_10_1016_j_imavis_2021_104195 crossref_primary_10_1117_1_JEI_33_3_033015 crossref_primary_10_3390_electronics13153053 crossref_primary_10_1587_transinf_2021EDP7254 crossref_primary_10_3390_s23239488 crossref_primary_10_1016_j_eswa_2022_118537 crossref_primary_10_1016_j_isprsjprs_2024_03_001 crossref_primary_10_1109_TITS_2022_3186587 crossref_primary_10_1109_TITS_2022_3161141 crossref_primary_10_1109_ACCESS_2024_3414859 crossref_primary_10_1109_TMM_2024_3405648 crossref_primary_10_1109_TITS_2023_3292278 crossref_primary_10_11834_jig_230605 crossref_primary_10_1088_1361_6501_ad9106 crossref_primary_10_1016_j_patcog_2022_109289 crossref_primary_10_1109_ACCESS_2024_3511430 crossref_primary_10_1109_LRA_2024_3349812 crossref_primary_10_20965_jaciii_2023_p0673 crossref_primary_10_1007_s00371_024_03569_y crossref_primary_10_1109_TIP_2024_3425048 crossref_primary_10_3389_fnbot_2023_1119231 crossref_primary_10_1016_j_compbiomed_2023_106932 crossref_primary_10_3390_rs15194708 crossref_primary_10_1109_TII_2022_3233674 crossref_primary_10_1109_TIM_2023_3341127 crossref_primary_10_1007_s11220_021_00375_x crossref_primary_10_3390_rs14164095 crossref_primary_10_1109_JSEN_2023_3304623 crossref_primary_10_1016_j_compbiomed_2022_106531 crossref_primary_10_3390_electronics13122406 crossref_primary_10_1080_2150704X_2023_2185110 crossref_primary_10_1109_TNNLS_2022_3230821 crossref_primary_10_1007_s00530_024_01596_2 crossref_primary_10_3390_s25061786 crossref_primary_10_1002_int_22804 crossref_primary_10_1109_TITS_2022_3182311 crossref_primary_10_1007_s11432_021_3590_1 crossref_primary_10_3233_THC_230278 crossref_primary_10_3390_app13031493 crossref_primary_10_1080_22797254_2023_2289616 crossref_primary_10_3390_rs14215399 crossref_primary_10_2139_ssrn_4100186 crossref_primary_10_1007_s10514_023_10113_9 crossref_primary_10_1109_TITS_2023_3248089 crossref_primary_10_3390_app12189095 crossref_primary_10_1016_j_autcon_2024_105614 crossref_primary_10_1109_TCE_2024_3373630 crossref_primary_10_3390_rs16132308 crossref_primary_10_1016_j_compbiomed_2024_108733 crossref_primary_10_3724_SP_J_1089_2022_19255 crossref_primary_10_3390_electronics13010043 crossref_primary_10_1007_s11042_023_15307_y crossref_primary_10_1007_s11554_023_01290_y crossref_primary_10_1109_TCSVT_2023_3298796 crossref_primary_10_1016_j_engappai_2025_110375 crossref_primary_10_3390_electronics11193238 crossref_primary_10_1038_s41597_024_03288_y crossref_primary_10_1109_TIP_2021_3122293 crossref_primary_10_34133_plantphenomics_0025 crossref_primary_10_1016_j_compag_2024_109544 crossref_primary_10_1109_JSEN_2024_3410403 crossref_primary_10_3390_rs13214357 crossref_primary_10_3389_fpls_2022_1043884 crossref_primary_10_1007_s00521_022_06932_z crossref_primary_10_1109_JSEN_2022_3188697 crossref_primary_10_3390_info15040230 
crossref_primary_10_1016_j_eswa_2024_124586 crossref_primary_10_1117_1_JEI_33_4_043014 crossref_primary_10_1109_TETCI_2023_3245103 crossref_primary_10_1016_j_neucom_2022_11_062 crossref_primary_10_3390_cells11111830 crossref_primary_10_1109_TMM_2024_3428349 crossref_primary_10_1016_j_bspc_2024_107062 crossref_primary_10_1109_TCSVT_2023_3271523 crossref_primary_10_1109_TITS_2024_3383405 crossref_primary_10_3390_biomimetics8040356 crossref_primary_10_1109_TCDS_2022_3225200 crossref_primary_10_2139_ssrn_4125814 crossref_primary_10_1109_TIP_2022_3162101 crossref_primary_10_3390_rs14246291 crossref_primary_10_1109_TIM_2023_3269107 crossref_primary_10_3390_electronics13183699 crossref_primary_10_1016_j_jag_2024_104258 crossref_primary_10_1007_s11042_023_17916_z crossref_primary_10_1007_s11263_022_01677_7 crossref_primary_10_1088_1742_6596_1966_1_012047 crossref_primary_10_1109_ACCESS_2023_3345790 crossref_primary_10_1016_j_autcon_2021_103833 crossref_primary_10_1364_OL_441817 crossref_primary_10_1109_TMI_2022_3215580 crossref_primary_10_1007_s41348_022_00584_w crossref_primary_10_1038_s41598_022_09452_x crossref_primary_10_1016_j_isprsjprs_2022_09_004 crossref_primary_10_3389_fnbot_2023_1204418 crossref_primary_10_1016_j_eswa_2024_125793 crossref_primary_10_1007_s11042_023_17260_2 crossref_primary_10_1109_TIM_2022_3224526 crossref_primary_10_1109_ACCESS_2023_3266251 crossref_primary_10_1007_s12559_025_10407_3 crossref_primary_10_1109_JSEN_2021_3062660 crossref_primary_10_1016_j_jvcir_2023_104028 crossref_primary_10_1007_s11227_023_05112_7 crossref_primary_10_1088_1742_6596_2010_1_012128 crossref_primary_10_1109_TIM_2021_3124053 crossref_primary_10_1109_TITS_2023_3348631 crossref_primary_10_1016_j_aei_2024_102964 crossref_primary_10_1109_TCSVT_2023_3293166 crossref_primary_10_1137_23M1577663 crossref_primary_10_1016_j_neucom_2021_07_019 crossref_primary_10_3390_app14219819 crossref_primary_10_1049_ipr2_12816 crossref_primary_10_1109_TIM_2022_3214605 crossref_primary_10_3390_rs17040685 crossref_primary_10_1007_s00521_022_07474_0 crossref_primary_10_1007_s10489_021_02437_9 crossref_primary_10_1109_TAI_2023_3341976 crossref_primary_10_3390_app142411741 crossref_primary_10_1007_s40747_023_01031_5 crossref_primary_10_1016_j_jag_2021_102456 crossref_primary_10_1109_TCSVT_2022_3216313 crossref_primary_10_1080_01431161_2022_2073795 crossref_primary_10_3390_rs14246193 crossref_primary_10_1007_s00371_023_03043_1 crossref_primary_10_1109_JBHI_2024_3350245 crossref_primary_10_1016_j_neucom_2021_12_003 crossref_primary_10_1109_ACCESS_2025_3529812 crossref_primary_10_3390_s24175460 crossref_primary_10_1109_TIP_2020_3048682 crossref_primary_10_1007_s00530_024_01417_6 crossref_primary_10_1109_TGRS_2022_3175613 crossref_primary_10_1109_TIM_2025_3547485 crossref_primary_10_1109_TMM_2021_3094333 crossref_primary_10_3724_SP_J_1089_2022_18909 crossref_primary_10_1109_TGRS_2022_3200872 crossref_primary_10_1088_1402_4896_ad69d5 crossref_primary_10_11834_jig_230653 crossref_primary_10_3390_s23146382 crossref_primary_10_1007_s00371_025_03800_4 crossref_primary_10_1007_s00521_025_11008_9 crossref_primary_10_1016_j_knosys_2022_109832 crossref_primary_10_1016_j_robot_2024_104900 crossref_primary_10_1109_TCBB_2022_3195705 crossref_primary_10_5194_gmd_18_1017_2025 crossref_primary_10_1007_s11263_024_02318_x crossref_primary_10_3390_s24010095 crossref_primary_10_3390_rs15051325 crossref_primary_10_1109_ACCESS_2024_3359425 crossref_primary_10_1109_TCSVT_2024_3376773 crossref_primary_10_1109_TGRS_2023_3330856 
crossref_primary_10_1109_TIP_2023_3272283 crossref_primary_10_1007_s13042_021_01443_0 crossref_primary_10_1109_JSTARS_2021_3078483 crossref_primary_10_1109_LGRS_2023_3304309 crossref_primary_10_3389_fpls_2023_1268218 crossref_primary_10_1007_s00138_025_01667_y crossref_primary_10_1109_TITS_2020_3044672 crossref_primary_10_3390_electronics13153033 |
Cites_doi | 10.1109/IVS.2017.7995966 10.1109/CVPR.2018.00199 10.1109/CVPR.2017.518 10.1109/CVPR.2015.7298965 10.1109/CVPR.2017.549 10.1109/ICCV.2015.179 10.1109/CVPR.2017.243 10.1007/978-3-030-01219-9_25 10.1109/CVPR.2018.00747 10.1007/978-3-030-01240-3_17 10.1109/TPAMI.2016.2644615 10.1109/CVPR.2017.660 10.24963/ijcai.2018/161 10.1109/CVPRW.2018.00201 10.1007/978-3-030-58520-4_3 10.1162/pres.1997.6.4.355 10.1007/978-3-319-46478-7_25 10.1109/ICIP.2019.8803154 10.1109/CVPR.2016.482 10.1109/ICME.2019.00166 10.1109/CVPR.2016.90 10.1109/CVPR.2018.00388 10.24963/ijcai.2017/479 10.1007/978-3-540-88682-2_5 10.1109/ITSC.2013.6728473 10.1109/CVPR.2018.00254 10.1109/TPAMI.2018.2890637 10.1109/CVPR.2016.396 10.1109/CVPR.2018.00474 10.1109/TMM.2017.2729786 10.1007/978-3-319-46487-9_32 10.1109/CVPR.2017.353 10.1007/978-3-030-01261-8_20 10.1007/978-3-030-01249-6_34 10.1109/CVPR.2015.7298594 10.1109/CVPR.2019.00326 10.1109/ICCV.2017.224 10.1109/CVPR.2019.00975 10.1109/TITS.2017.2750080 10.1109/CVPR.2018.00813 10.1109/CVPR.2016.350 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
Copyright_xml | – notice: Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
DBID | 97E RIA RIE AAYXX CITATION NPM 7SC 7SP 8FD JQ2 L7M L~C L~D 7X8 |
DOI | 10.1109/TIP.2020.3042065 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005–Present IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Electronic Library (IEL) CrossRef PubMed Computer and Information Systems Abstracts Electronics & Communications Abstracts Technology Research Database ProQuest Computer Science Collection Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Academic Computer and Information Systems Abstracts Professional MEDLINE - Academic |
DatabaseTitle | CrossRef PubMed Technology Research Database Computer and Information Systems Abstracts – Academic Electronics & Communications Abstracts ProQuest Computer Science Collection Computer and Information Systems Abstracts Advanced Technologies Database with Aerospace Computer and Information Systems Abstracts Professional MEDLINE - Academic |
DatabaseTitleList | MEDLINE - Academic PubMed Technology Research Database |
Database_xml | – sequence: 1 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 2 dbid: RIE name: IEEE Electronic Library (IEL) url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/ sourceTypes: Publisher |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Applied Sciences Engineering |
EISSN | 1941-0042 |
EndPage | 1 |
ExternalDocumentID | 33306466 10_1109_TIP_2020_3042065 9292449 |
Genre | orig-research Journal Article |
GrantInformation_xml | – fundername: National Key Research and Development Program of China grantid: 2017YFC0820605 – fundername: National Natural Science Foundation of China grantid: 61525206, U1703261, 61871004 |
GroupedDBID | --- -~X .DC 0R~ 29I 4.4 5GY 6IK 97E AAJGR AARMG AASAJ AAWTH ABAZT ABQJQ ABVLG ACGFO ACGFS ACIWK AENEX AGQYO AHBIQ AKJIK AKQYR ALMA_UNASSIGNED_HOLDINGS ASUFR ATWAV BEFXN BFFAM BGNUA BKEBE BPEOZ CS3 DU5 EBS EJD F5P HZ~ IFIPE IPLJI JAVBF LAI M43 MS~ O9- OCL P2P RIA RIE RNS TAE TN5 53G 5VS AAYOK AAYXX ABFSI AETIX AGSQL AI. AIBXA ALLEH CITATION E.L H~9 ICLAB IFJZH RIG VH1 NPM PKN Z5M 7SC 7SP 8FD JQ2 L7M L~C L~D 7X8 |
ID | FETCH-LOGICAL-c413t-e485f5e00135777a2eff724749bd1121b1aeb33c7b59af2d93b9ca38d1ece9eb3 |
IEDL.DBID | RIE |
ISSN | 1057-7149 1941-0042 |
IngestDate | Fri Jul 11 05:24:46 EDT 2025 Mon Jun 30 10:14:49 EDT 2025 Wed Feb 19 02:30:26 EST 2025 Tue Jul 01 02:03:24 EDT 2025 Thu Apr 24 23:09:07 EDT 2025 Wed Aug 27 02:33:41 EDT 2025 |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
LinkModel | DirectLink |
MergedId | FETCHMERGED-LOGICAL-c413t-e485f5e00135777a2eff724749bd1121b1aeb33c7b59af2d93b9ca38d1ece9eb3 |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
ORCID | 0000-0001-8691-8549 0000-0003-3573-2407 0000-0002-7857-1546 0000-0001-7434-0487 0000-0002-1151-1792 |
PMID | 33306466 |
PQID | 2471914803 |
PQPubID | 85429 |
PageCount | 1 |
ParticipantIDs | proquest_miscellaneous_2470628457 ieee_primary_9292449 pubmed_primary_33306466 crossref_citationtrail_10_1109_TIP_2020_3042065 proquest_journals_2471914803 crossref_primary_10_1109_TIP_2020_3042065 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 2021-01-01 |
PublicationDateYYYYMMDD | 2021-01-01 |
PublicationDate_xml | – month: 01 year: 2021 text: 2021-01-01 day: 01 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationPlace_xml | – name: United States – name: New York |
PublicationTitle | IEEE transactions on image processing |
PublicationTitleAbbrev | TIP |
PublicationTitleAlternate | IEEE Trans Image Process |
PublicationYear | 2021 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
Publisher_xml | – name: IEEE – name: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
References | ref12 ref59 ref58 ref53 ref52 ref55 ref11 chen (ref15) 2016 ref54 ref10 ref17 ref19 wu (ref34) 2019 ref18 chen (ref9) 2018 ref50 azuma (ref3) 1997; 6 ref46 zhang (ref43) 2012 ref45 ref48 ref42 ref41 ref44 paszke (ref13) 2016 ref8 ref7 yu (ref14) 2015 ref4 ref6 ref5 howard (ref51) 2017 liu (ref40) 2017 kingma (ref56) 2015 ref35 ref37 ref36 treml (ref16) 2016; 2 ref33 ref32 chen (ref23) 2018 ref2 ref39 ref38 hu (ref49) 2017 chen (ref57) 2020 ref24 ref26 ref25 ref20 yuan (ref31) 2018 ref22 ref21 ref27 evening (ref1) 2012 ref29 liu (ref28) 2019 chen (ref30) 2017 bahdanau (ref47) 2014 |
References_xml | – ident: ref53 doi: 10.1109/IVS.2017.7995966 – ident: ref8 doi: 10.1109/CVPR.2018.00199 – ident: ref58 doi: 10.1109/CVPR.2017.518 – start-page: 1520 year: 2017 ident: ref40 article-title: Learning affinity via spatial propagation networks publication-title: Proc Adv Neural Inf Process Syst – ident: ref4 doi: 10.1109/CVPR.2015.7298965 – ident: ref18 doi: 10.1109/CVPR.2017.549 – ident: ref42 doi: 10.1109/ICCV.2015.179 – ident: ref7 doi: 10.1109/CVPR.2017.243 – start-page: 109 year: 2012 ident: ref43 article-title: Efficient inference for fully-connected CRFs with stationarity publication-title: Proc IEEE Conf Comput Vis Pattern Recognit – ident: ref11 doi: 10.1007/978-3-030-01219-9_25 – ident: ref37 doi: 10.1109/CVPR.2018.00747 – year: 2019 ident: ref28 article-title: FDDWNet: A lightweight convolutional neural network for real-time sementic segmentation publication-title: arXiv 1911 00632 – year: 2015 ident: ref14 article-title: Multi-scale context aggregation by dilated convolutions publication-title: arXiv 1511 07122 – ident: ref41 doi: 10.1007/978-3-030-01240-3_17 – year: 2016 ident: ref13 article-title: ENet: A deep neural network architecture for real-time semantic segmentation publication-title: ArXiv 1606 02147 – ident: ref20 doi: 10.1109/TPAMI.2016.2644615 – ident: ref17 doi: 10.1109/CVPR.2017.660 – ident: ref55 doi: 10.24963/ijcai.2018/161 – volume: 2 start-page: 7 year: 2016 ident: ref16 article-title: Speeding up semantic segmentation for autonomous driving publication-title: Proc MLITS NIPS Workshop – ident: ref25 doi: 10.1109/CVPRW.2018.00201 – year: 2012 ident: ref1 publication-title: Adobe Photoshop CS3 for photographers a professional image editor's guide to the creative use of Photoshop for the Macintosh and PC – ident: ref46 doi: 10.1007/978-3-030-58520-4_3 – year: 2018 ident: ref31 article-title: OCNet: Object context network for scene parsing publication-title: arXiv 1809 00916 – volume: 6 start-page: 355 year: 1997 ident: ref3 article-title: A survey of augmented reality publication-title: Presence Teleoperators Virtual Environ doi: 10.1162/pres.1997.6.4.355 – ident: ref44 doi: 10.1007/978-3-319-46478-7_25 – ident: ref27 doi: 10.1109/ICIP.2019.8803154 – ident: ref45 doi: 10.1109/CVPR.2016.482 – start-page: 1 year: 2015 ident: ref56 article-title: Adam: A method for stochastic optimization publication-title: Proc ICLR – ident: ref33 doi: 10.1109/ICME.2019.00166 – ident: ref6 doi: 10.1109/CVPR.2016.90 – year: 2017 ident: ref49 article-title: Squeeze-and-Excitation networks publication-title: arXiv 1709 01507 – start-page: 8699 year: 2018 ident: ref23 article-title: Searching for efficient multi-scale architectures for dense image prediction publication-title: Proc Adv Neural Inf Process Syst – ident: ref10 doi: 10.1109/CVPR.2018.00388 – year: 2017 ident: ref30 article-title: Rethinking atrous convolution for semantic image segmentation publication-title: arXiv 1706 05587 – year: 2017 ident: ref51 article-title: MobileNets: Efficient convolutional neural networks for mobile vision applications publication-title: arXiv 1704 04861 – ident: ref39 doi: 10.24963/ijcai.2017/479 – ident: ref24 doi: 10.1007/978-3-540-88682-2_5 – year: 2018 ident: ref9 article-title: Encoder-decoder with atrous separable convolution for semantic image segmentation publication-title: arXiv 1802 02611 – start-page: 1 year: 2020 ident: ref57 article-title: Fasterseg: Searching for faster real-time semantic segmentation publication-title: Proc Int Conf Learn Represent – ident: ref2 
doi: 10.1109/ITSC.2013.6728473 – ident: ref36 doi: 10.1109/CVPR.2018.00254 – year: 2019 ident: ref34 article-title: Consensus feature network for scene parsing publication-title: arXiv 1907 12411 – ident: ref38 doi: 10.1109/TPAMI.2018.2890637 – year: 2016 ident: ref15 article-title: DeepLab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs publication-title: arXiv 1606 00915 – ident: ref48 doi: 10.1109/CVPR.2016.396 – ident: ref54 doi: 10.1109/CVPR.2018.00474 – ident: ref32 doi: 10.1109/TMM.2017.2729786 – year: 2014 ident: ref47 article-title: Neural machine translation by jointly learning to align and translate publication-title: arXiv 1409 0473 – ident: ref59 doi: 10.1007/978-3-319-46487-9_32 – ident: ref19 doi: 10.1109/CVPR.2017.353 – ident: ref22 doi: 10.1007/978-3-030-01261-8_20 – ident: ref12 doi: 10.1007/978-3-030-01249-6_34 – ident: ref52 doi: 10.1109/CVPR.2015.7298594 – ident: ref35 doi: 10.1109/CVPR.2019.00326 – ident: ref29 doi: 10.1109/ICCV.2017.224 – ident: ref26 doi: 10.1109/CVPR.2019.00975 – ident: ref21 doi: 10.1109/TITS.2017.2750080 – ident: ref50 doi: 10.1109/CVPR.2018.00813 – ident: ref5 doi: 10.1109/CVPR.2016.350 |
SSID | ssj0014516 |
Score | 2.7207959 |
Snippet | The demand of applying semantic segmentation model on mobile devices has been increasing rapidly. Current state-of-the-art networks have enormous amount of... |
SourceID | proquest pubmed crossref ieee |
SourceType | Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 1 |
SubjectTerms | Computational modeling Computer architecture Context Context Guided Context modeling Electronic devices Global Context Image segmentation Mathematical models Memory devices Mobile handsets Parameters Post-production processing Predictive models Semantic Segmentation Semantics Surrounding Context |
Title | CGNet: A Light-weight Context Guided Network for Semantic Segmentation |
URI | https://ieeexplore.ieee.org/document/9292449 https://www.ncbi.nlm.nih.gov/pubmed/33306466 https://www.proquest.com/docview/2471914803 https://www.proquest.com/docview/2470628457 |
Volume | 30 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
link | http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV3fa9swED7SPpTsob-yrd7aokFfBnNiW3Jk9a2Epe1YwmAJ5M1YsjzKVqe0Ni3763eSZW-UtfTNIMm2dCfuO53uO4ATOTak7Cr2ozwc-wwNlp-FReInuWZo3wxIMYnCs_n4Ysm-rOJVDz51uTBaa3v5TA_No43l52tVm6OyEZpytEZiAzbQcWtytbqIgSk4ayObMfc5wv42JBmI0eLyGzqCEfqnqKFocvuwRalB3pYa8a81suVVnkaa1uJMd2DW_mtz0eTnsK7kUP1-ROP40snswraDnuSs0ZU96OlyH3YcDCVuk9_tw6t_OAoHMJ2cz3V1Ss7IV8s5cm_PUolltXqoyHl9lePoeXObnCAEJt_1NYrrSuHDj2uX2lS-huX082Jy4bviC75Cu1b5miVxEWsDEWPOeRbpouAR40zIHDFaKMMM_XCquIxFVkS5oFKojCZ5qJUW2PQGNst1qQ-ASBUKFaBnFhQJyynNBIKMMKNMUakKHnowaoWQKsdMbgpk_EqthxKIFCWYGgmmToIefOxG3DSsHM_0HZjF7_q5dffgsJVz6rbtXYrzM3x3SUA9-NA144YzUZSs1Ova9jF5pyzmHrxt9KN7d6tW7_7_zffQj8yVGHuCcwib1W2tjxDTVPLYKvMf98vtVQ |
linkProvider | IEEE |
linkToHtml | http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV3NbtQwEB6VIkE5UGj5CRQwEhw4ZDexnXWMxKEqbHfpdoXEVuotjR0HVdAsYhMVeBZehXdj7DgBIeBWiVsk_0S2x55vPDOfAZ6okSVl10lIi3gUclRYYR6XaZgWhqN-syDFJgofzkeTI_76ODleg299LowxxgWfmYH9dL78Yqkbe1U2RFWO2kj6EMoD8-UcDbTVi-lLXM2nlI5fLfYmoX9DINR4PNeh4WlSJsYinUQIkVNTloJywaUqEGrEKs7RnGRaqETmJS0kU1LnLC1io43EIuz3ElxGnJHQNjus91HYJ26dLzURoUBDo3OCRnK4mL5B05OiRYx7ApX8BlxhzGJ9R8b4U_-5B13-jm2djhtvwvdudtrQlveDplYD_fU34sj_dfpuwHUPrsluuxtuwpqptmDTA23ij7HVFlz7hYVxG8Z7-3NTPye7ZOZYVc7dbTFxvF2fa7LfnBbYet7GyxME-eStOUOBPNX48e7MJ29Vt-DoQsZ2G9arZWXuAlE6ljpC2zMqU14wlkuEUXHOuGZKlyIOYNgteqY997p9AuRD5mywSGYoMZmVmMxLTADP-hYfW96Rf9Tdtovd1_PrHMBOJ1eZP5hWGY7PMvqlEQvgcV-MR4r1E-WVWTaujs2s5YkI4E4rj33fnRjf-_M_H8HVyeJwls2m84P7sEFtAJC7r9qB9fpTYx4ggqvVQ7eRCJxctOj9AFGbTDA |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=CGNet%3A+A+Light-weight+Context+Guided+Network+for+Semantic+Segmentation&rft.jtitle=IEEE+transactions+on+image+processing&rft.au=Wu%2C+Tianyi&rft.au=Tang%2C+Sheng&rft.au=Rui%2C+Zhang&rft.au=Cao%2C+Juan&rft.date=2021-01-01&rft.pub=IEEE&rft.issn=1057-7149&rft.spage=1&rft.epage=1&rft_id=info:doi/10.1109%2FTIP.2020.3042065&rft_id=info%3Apmid%2F33306466&rft.externalDocID=9292449 |
thumbnail_l | http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=1057-7149&client=summon |
thumbnail_m | http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=1057-7149&client=summon |
thumbnail_s | http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=1057-7149&client=summon |