ScribbleSup: Scribble-Supervised Convolutional Networks for Semantic Segmentation
Published in | 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) pp. 3159 - 3167 |
---|---|
Main Authors | Di Lin, Jifeng Dai, Jiaya Jia, Kaiming He, Jian Sun |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.06.2016 |
Subjects | Cows; Graphical models; Histograms; Image segmentation; Labeling; Semantics; Training |
Online Access | Get full text |
Abstract | Large-scale data is of crucial importance for learning semantic segmentation models, but annotating per-pixel masks is a tedious and inefficient procedure. We note that for the topic of interactive image segmentation, scribbles are very widely used in academic research and commercial software, and are recognized as one of the most user-friendly ways of interacting. In this paper, we propose to use scribbles to annotate images, and develop an algorithm to train convolutional networks for semantic segmentation supervised by scribbles. Our algorithm is based on a graphical model that jointly propagates information from scribbles to unmarked pixels and learns network parameters. We present competitive object semantic segmentation results on the PASCAL VOC dataset by using scribbles as annotations. Scribbles are also favored for annotating stuff (e.g., water, sky, grass) that has no well-defined shape, and our method shows excellent results on the PASCAL-CONTEXT dataset thanks to extra inexpensive scribble annotations. Our scribble annotations on PASCAL VOC are available at http://research.microsoft.com/en-us/um/people/jifdai/downloads/scribble_sup. |
---|---|
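The abstract describes an alternating scheme: propagate label information from scribbles to unmarked pixels via a graphical model, then train the network on the propagated masks. The toy sketch below illustrates only the propagation step, replacing the paper's graphical model with a hypothetical nearest-scribble rule over color and position; `propagate_scribbles` and its weights are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def propagate_scribbles(image, scribbles, color_weight=1.0, pos_weight=0.1):
    """Assign each unmarked pixel the label of its 'closest' scribble pixel.

    Closeness mixes color similarity and spatial distance; this is a toy
    stand-in for the graphical-model propagation the paper describes.

    image:     (H, W, 3) float array of pixel colors.
    scribbles: (H, W) int array; label >= 0 on scribbled pixels, -1 elsewhere.
    """
    h, w = scribbles.shape
    ys, xs = np.where(scribbles >= 0)           # coordinates of scribbled pixels
    labels = np.full((h, w), -1, dtype=int)
    for y in range(h):
        for x in range(w):
            if scribbles[y, x] >= 0:            # keep annotated pixels as-is
                labels[y, x] = scribbles[y, x]
                continue
            # cost to each scribbled pixel: color distance + spatial distance
            color_d = np.sum((image[ys, xs] - image[y, x]) ** 2, axis=1)
            pos_d = (ys - y) ** 2 + (xs - x) ** 2
            cost = color_weight * color_d + pos_weight * pos_d
            k = np.argmin(cost)                 # nearest scribble wins
            labels[y, x] = scribbles[ys[k], xs[k]]
    return labels
```

In the full method the resulting dense labels would serve as pseudo ground truth for a network-training step, and the two steps would alternate; the propagation shown here is deliberately simplistic.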
Author | Jian Sun; Jifeng Dai; Kaiming He; Jiaya Jia; Di Lin |
Author_xml | – sequence: 1 surname: Di Lin fullname: Di Lin – sequence: 2 surname: Jifeng Dai fullname: Jifeng Dai – sequence: 3 surname: Jiaya Jia fullname: Jiaya Jia – sequence: 4 surname: Kaiming He fullname: Kaiming He – sequence: 5 surname: Jian Sun fullname: Jian Sun |
CODEN | IEEPAD |
ContentType | Conference Proceeding |
DBID | 6IE 6IH CBEJK RIE RIO |
DOI | 10.1109/CVPR.2016.344 |
DatabaseName | IEEE Electronic Library (IEL) Conference Proceedings; IEEE Proceedings Order Plan (POP) 1998-present by volume; IEEE Xplore All Conference Proceedings; IEEE Electronic Library Online; IEEE Proceedings Order Plans (POP) 1998-present |
DatabaseTitleList | |
Database_xml | – sequence: 1 dbid: RIE name: IEEE Electronic Library Online url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/ sourceTypes: Publisher |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Applied Sciences Computer Science |
EISBN | 9781467388511 1467388513 |
EISSN | 1063-6919 |
EndPage | 3167 |
ExternalDocumentID | 7780713 |
Genre | orig-research |
GroupedDBID | 23M 29F 29O 6IE 6IH 6IK ACGFS ALMA_UNASSIGNED_HOLDINGS CBEJK G8K IPLJI JC5 M43 RIE RIG RIO RNS |
ID | FETCH-LOGICAL-c258t-3c3bd0071ea18b2674dcd07064d064d555b9a07bfbfb25ed13006a2fc72b0523 |
IEDL.DBID | RIE |
IngestDate | Wed Jun 26 19:26:46 EDT 2024 |
IsDoiOpenAccess | false |
IsOpenAccess | true |
IsPeerReviewed | false |
IsScholarly | true |
Language | English |
LinkModel | DirectLink |
MergedId | FETCHMERGED-LOGICAL-c258t-3c3bd0071ea18b2674dcd07064d064d555b9a07bfbfb25ed13006a2fc72b0523 |
OpenAccessLink | http://arxiv.org/pdf/1604.05144 |
PageCount | 9 |
ParticipantIDs | ieee_primary_7780713 |
PublicationCentury | 2000 |
PublicationDate | 2016-June |
PublicationDateYYYYMMDD | 2016-06-01 |
PublicationDate_xml | – month: 06 year: 2016 text: 2016-June |
PublicationDecade | 2010 |
PublicationTitle | 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) |
PublicationTitleAbbrev | CVPR |
PublicationYear | 2016 |
Publisher | IEEE |
Publisher_xml | – name: IEEE |
SSID | ssj0001968189 ssj0023720 |
Score | 2.621568 |
Snippet | Large-scale data is of crucial importance for learning semantic segmentation models, but annotating per-pixel masks is a tedious and inefficient procedure. We... |
SourceID | ieee |
SourceType | Publisher |
StartPage | 3159 |
SubjectTerms | Cows; Graphical models; Histograms; Image segmentation; Labeling; Semantics; Training |
Title | ScribbleSup: Scribble-Supervised Convolutional Networks for Semantic Segmentation |
URI | https://ieeexplore.ieee.org/document/7780713 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
linkProvider | IEEE |