ECFFNet: Effective and Consistent Feature Fusion Network for RGB-T Salient Object Detection
Published in | IEEE Transactions on Circuits and Systems for Video Technology, Vol. 32, No. 3, pp. 1224-1235 |
Main Authors | Zhou, Wujie; Guo, Qinling; Lei, Jingsheng; Yu, Lu; Hwang, Jenq-Neng |
Format | Journal Article |
Language | English |
Published | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2022 |
Online Access | https://ieeexplore.ieee.org/document/9420662 |
Abstract | Under ideal environmental conditions, RGB-based deep convolutional neural networks can achieve high performance in salient object detection (SOD). In scenes with cluttered backgrounds and many objects, depth maps have been combined with RGB images to better distinguish spatial positions and structures during SOD, achieving high accuracy. However, under low-light or uneven lighting conditions, RGB and depth information may be insufficient for detection. Thermal images are insensitive to lighting and weather conditions and can capture important objects even at night. By combining thermal and RGB images, we propose an effective and consistent feature fusion network (ECFFNet) for RGB-T SOD. In ECFFNet, an effective cross-modality fusion module fully fuses features of corresponding sizes from the RGB and thermal modalities. Then, a bilateral reversal fusion module fuses foreground and background information bilaterally, enabling the full extraction of salient object boundaries. Finally, a multilevel consistent fusion module combines features across different levels to obtain complementary information. Comprehensive experiments on three RGB-T SOD datasets show that ECFFNet outperforms 12 state-of-the-art methods across multiple evaluation metrics. |
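The abstract names three fusion stages but not their internals. As orientation only, here is a minimal PyTorch sketch of what such stages commonly look like; every design choice below (sigmoid gating, attention by a predicted saliency map, bilinear upsampling, channel counts) is our assumption for illustration, not ECFFNet's published architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CrossModalityFusion(nn.Module):
    """Fuse same-resolution RGB and thermal features (hypothetical design)."""
    def __init__(self, channels: int):
        super().__init__()
        # Per-pixel gate deciding how much to trust each modality.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.merge = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, rgb: torch.Tensor, thermal: torch.Tensor) -> torch.Tensor:
        w = self.gate(torch.cat([rgb, thermal], dim=1))
        gated = w * rgb + (1.0 - w) * thermal
        return self.merge(torch.cat([gated, rgb + thermal], dim=1))

class BilateralReversalFusion(nn.Module):
    """Weight features by a foreground map and its reversal (background),
    so both sides of the object boundary contribute (hypothetical design)."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1)

    def forward(self, feat: torch.Tensor, saliency_logits: torch.Tensor) -> torch.Tensor:
        fg = torch.sigmoid(saliency_logits)   # foreground attention, shape (B, 1, H, W)
        bg = 1.0 - fg                         # reversed map: background attention
        return self.conv(torch.cat([feat * fg, feat * bg], dim=1))

class MultilevelConsistentFusion(nn.Module):
    """Upsample multi-level features to a common size and merge them
    (hypothetical design)."""
    def __init__(self, channels: int, num_levels: int):
        super().__init__()
        self.merge = nn.Conv2d(num_levels * channels, channels, kernel_size=1)

    def forward(self, feats: list[torch.Tensor]) -> torch.Tensor:
        size = feats[0].shape[-2:]  # use the highest-resolution level as reference
        up = [F.interpolate(f, size=size, mode="bilinear", align_corners=False)
              for f in feats]
        return self.merge(torch.cat(up, dim=1))
```

The gate in CrossModalityFusion mirrors the abstract's motivation: where RGB is unreliable (low light), a learned per-pixel weight can lean on the thermal features instead, and vice versa.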
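The record does not name the evaluation indicators. In the RGB-T SOD literature they are typically S-measure, F-measure, E-measure, and mean absolute error (MAE); the two simplest, MAE and fixed-threshold F-measure, can be sketched as follows (conventions such as beta^2 = 0.3 are the field's usual defaults, not taken from this paper).

```python
import numpy as np

def mae(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean absolute error between a [0, 1] saliency map and binary ground truth."""
    return float(np.abs(pred.astype(np.float64) - gt.astype(np.float64)).mean())

def f_measure(pred: np.ndarray, gt: np.ndarray,
              threshold: float = 0.5, beta2: float = 0.3) -> float:
    """F-measure at a fixed threshold; beta^2 = 0.3 is the usual SOD convention."""
    binary = pred >= threshold
    positives = gt > 0.5
    tp = np.logical_and(binary, positives).sum()
    precision = tp / max(binary.sum(), 1)
    recall = tp / max(positives.sum(), 1)
    return float((1 + beta2) * precision * recall
                 / max(beta2 * precision + recall, 1e-8))
```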
Affiliations | Zhou, Wujie (wujiezhou@163.com), Guo, Qinling, and Lei, Jingsheng: School of Information and Electronic Engineering, Zhejiang University of Science and Technology, Hangzhou, China; Yu, Lu: Institute of Information and Communication Engineering, Zhejiang University, Hangzhou, China; Hwang, Jenq-Neng: Department of Electrical Engineering, University of Washington, Seattle, WA, USA |
CODEN | ITCTEM |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
DOI | 10.1109/TCSVT.2021.3077058 |
Discipline | Engineering |
EISSN | 1558-2205 |
Genre | Original research |
Funding | National Natural Science Foundation of China (Grants 61502429 and 61972357); Zhejiang Provincial Natural Science Foundation of China (Grant LY18F020012) |
ISSN | 1051-8215 |
IsPeerReviewed | true |
IsScholarly | true |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | Zhou, Wujie: 0000-0002-3055-2493; Hwang, Jenq-Neng: 0000-0002-8877-2421 |
PublicationTitleAbbrev | TCSVT |
SubjectTerms | Artificial neural networks; bilateral reversal fusion module; Color imagery; cross-modality fusion; Decoding; Feature extraction; Imaging; Lighting; Meteorology; Modules; multilevel consistent fusion module; Object recognition; RGB-T data; Salience; salient object detection; Sorting; Streaming media; Weather |
URI | https://ieeexplore.ieee.org/document/9420662 https://www.proquest.com/docview/2637442831 |
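For citation managers, the record's fields assemble into a BibTeX entry as follows (the field mapping and citation key are ours, not supplied by the record):

```bibtex
@article{Zhou2022ECFFNet,
  author  = {Zhou, Wujie and Guo, Qinling and Lei, Jingsheng and Yu, Lu and Hwang, Jenq-Neng},
  title   = {{ECFFNet}: Effective and Consistent Feature Fusion Network for {RGB-T} Salient Object Detection},
  journal = {IEEE Transactions on Circuits and Systems for Video Technology},
  year    = {2022},
  volume  = {32},
  number  = {3},
  pages   = {1224--1235},
  month   = mar,
  doi     = {10.1109/TCSVT.2021.3077058},
  issn    = {1051-8215}
}
```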