Review of Visual Saliency Detection With Comprehensive Information
Published in | IEEE Transactions on Circuits and Systems for Video Technology, Vol. 29, No. 10, pp. 2941–2959 |
Main Authors | Cong, Runmin; Lei, Jianjun; Fu, Huazhu; Cheng, Ming-Ming; Lin, Weisi; Huang, Qingming |
Format | Journal Article |
Language | English |
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.10.2019 |
Abstract | The visual saliency detection model simulates the human visual system to perceive the scene and has been widely used in many vision tasks. With the development of acquisition technology, more comprehensive information, such as depth cue, inter-image correspondence, or temporal relationship, is available to extend image saliency detection to RGBD saliency detection, co-saliency detection, or video saliency detection. The RGBD saliency detection model focuses on extracting the salient regions from RGBD images by combining the depth information. The co-saliency detection model introduces the inter-image correspondence constraint to discover the common salient object in an image group. The goal of the video saliency detection model is to locate the motion-related salient object in video sequences, which considers the motion cue and spatiotemporal constraint jointly. In this paper, we review different types of saliency detection algorithms, summarize the important issues of the existing methods, and discuss the existent problems and future works. Moreover, the evaluation datasets and quantitative measurements are briefly introduced, and the experimental analysis and discussion are conducted to provide a holistic overview of different saliency detection methods. |
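The abstract notes that quantitative measurements for saliency detection are briefly introduced in the review. As a hedged illustration only (not code from the paper), the following sketch implements two metrics that are standard in the saliency literature: mean absolute error (MAE) and the precision-weighted F-measure; the pixel lists and threshold are illustrative choices, not values taken from the review.

```python
# Hedged illustration, not code from the paper: two metrics commonly used
# to evaluate saliency detection, mean absolute error (MAE) and the
# weighted F-measure. Maps are flat lists of per-pixel values here for
# simplicity; real evaluations run over full-resolution images.

def mae(saliency, gt):
    """Mean absolute error between a saliency map in [0, 1] and a binary mask."""
    return sum(abs(s - g) for s, g in zip(saliency, gt)) / len(gt)

def f_measure(saliency, gt, threshold=0.5, beta2=0.3):
    """F-measure after binarizing the map at `threshold`; beta^2 = 0.3 is the
    usual convention, weighting precision more heavily than recall."""
    pred = [1 if s >= threshold else 0 for s in saliency]
    tp = sum(1 for p, g in zip(pred, gt) if p and g)
    precision = tp / max(sum(pred), 1)
    recall = tp / max(sum(gt), 1)
    if precision + recall == 0:
        return 0.0
    return (1 + beta2) * precision * recall / (beta2 * precision + recall)

# A perfect prediction on a tiny 4-pixel example:
sal = [0.9, 0.8, 0.2, 0.1]
gt = [1, 1, 0, 0]
print(mae(sal, gt))        # ≈ 0.15
print(f_measure(sal, gt))  # 1.0
```

Lower MAE and higher F-measure indicate better agreement with the ground-truth mask; surveys of this kind typically report both, since MAE penalizes diffuse false positives that a thresholded F-measure can miss.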
Author | Huang, Qingming; Cong, Runmin; Lin, Weisi; Fu, Huazhu; Cheng, Ming-Ming; Lei, Jianjun |
Author_xml | – sequence: 1; Runmin Cong (ORCID: 0000-0003-0972-4008); rmcong@tju.edu.cn; School of Electrical and Information Engineering, Tianjin University, Tianjin, China
– sequence: 2; Jianjun Lei (ORCID: 0000-0003-3171-7680); jjlei@tju.edu.cn; School of Electrical and Information Engineering, Tianjin University, Tianjin, China
– sequence: 3; Huazhu Fu (ORCID: 0000-0002-9702-5524); huazhufu@gmail.com; Inception Institute of Artificial Intelligence, Abu Dhabi, United Arab Emirates
– sequence: 4; Ming-Ming Cheng (ORCID: 0000-0001-5550-8758); cmm@nankai.edu.cn; School of Computer and Control Engineering, Nankai University, Tianjin, China
– sequence: 5; Weisi Lin (ORCID: 0000-0001-9866-1947); wslin@ntu.edu.sg; School of Computer Science Engineering, Nanyang Technological University, Singapore
– sequence: 6; Qingming Huang (ORCID: 0000-0001-7542-296X); qmhuang@ucas.ac.cn; School of Computer and Control Engineering, University of Chinese Academy of Sciences, Beijing, China |
CODEN | ITCTEM |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2019 |
DOI | 10.1109/TCSVT.2018.2870832 |
Discipline | Engineering |
EISSN | 1558-2205 |
EndPage | 2959 |
Genre | orig-research |
GrantInformation_xml | – fundername: National Key Research and Development Program of China grantid: 2017YFB1002900 – fundername: National Natural Science Foundation of China grantid: 61722112; 61520106002; 61731003; 61332016; 61620106009; U1636214; 61602344 funderid: 10.13039/501100001809 |
ISSN | 1051-8215 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 10 |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0002-9702-5524 0000-0001-5550-8758 0000-0003-0972-4008 0000-0001-9866-1947 0000-0003-3171-7680 0000-0001-7542-296X |
PageCount | 19 |
PublicationDate | 2019-10-01 |
PublicationPlace | New York |
PublicationTitle | IEEE transactions on circuits and systems for video technology |
PublicationTitleAbbrev | TCSVT |
PublicationYear | 2019 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
SecondaryResourceType | review_article |
StartPage | 2941 |
SubjectTerms | Algorithms; co-saliency detection; Computer simulation; depth attribute; Feature extraction; Image acquisition; Image color analysis; Image detection; Imaging; Integrated circuit modeling; inter-image correspondence; RGBD saliency detection; Salience; Saliency detection; Salient object; spatiotemporal constraint; video saliency detection; Visual perception; Visual systems; Visualization |
Title | Review of Visual Saliency Detection With Comprehensive Information |
URI | https://ieeexplore.ieee.org/document/8466906 https://www.proquest.com/docview/2300330649 |
Volume | 29 |