MobileSal: Extremely Efficient RGB-D Salient Object Detection
Published in | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 44, no. 12, pp. 10261-10269
Main Authors | Yu-Huan Wu, Yun Liu, Jun Xu, Jia-Wang Bian, Yu-Chao Gu, Ming-Ming Cheng
Format | Journal Article
Language | English
Published | New York: IEEE, 01.12.2022
Publisher | The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Abstract | The high computational cost of neural networks has prevented recent successes in RGB-D salient object detection (SOD) from benefiting real-world applications. Hence, this article introduces a novel network, MobileSal, which focuses on efficient RGB-D SOD using mobile networks for deep feature extraction. However, mobile networks are less powerful in feature representation than cumbersome networks. To this end, we observe that the depth information of color images can strengthen the feature representation related to SOD if leveraged properly. Therefore, we propose an implicit depth restoration (IDR) technique to strengthen the mobile networks' feature representation capability for RGB-D SOD. IDR is only adopted in the training phase and is omitted during testing, so it is computationally free. Besides, we propose compact pyramid refinement (CPR) for efficient multi-level feature aggregation to derive salient objects with clear boundaries. With IDR and CPR incorporated, MobileSal performs favorably against state-of-the-art methods on six challenging RGB-D SOD datasets with much faster speed (450 fps for the input size of 320×320) and fewer parameters (6.5M). The code is released at https://mmcheng.net/mobilesal .
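The abstract's key efficiency idea, an auxiliary task used only at train time, can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the loss functions, the `lambda_idr` weight, and all function names here are hypothetical stand-ins for the paper's actual IDR branch.

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    # Binary cross-entropy, a common choice for the saliency-map loss.
    pred = np.clip(pred, eps, 1 - eps)
    return float(-np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred)))

def l1(pred, target):
    # L1 distance as a stand-in for the depth-restoration objective.
    return float(np.mean(np.abs(pred - target)))

def training_loss(sal_pred, sal_gt, depth_pred, depth_gt, lambda_idr=0.3):
    # During training, an auxiliary branch restores the depth map from
    # RGB features; its loss is added to the saliency loss so the shared
    # backbone learns depth-aware features. lambda_idr is a made-up weight.
    return bce(sal_pred, sal_gt) + lambda_idr * l1(depth_pred, depth_gt)

def inference(sal_pred):
    # At test time the auxiliary branch is simply dropped, so it costs
    # no extra FLOPs: only the saliency prediction is computed.
    return sal_pred
```

Because the auxiliary branch never runs at inference, the test-time network is exactly as fast as one trained without it, which is what the abstract means by "computationally free".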
Authors | Yu-Huan Wu (wuyuhuan@mail.nankai.edu.cn; ORCID 0000-0001-8666-3435); Yun Liu (yun.liu@vision.ee.ethz.ch; ORCID 0000-0001-6143-0264); Jun Xu (nankaimathjunxu@gmail.com); Jia-Wang Bian (jiawang.bian@gmail.com; ORCID 0000-0003-2046-3363); Yu-Chao Gu (ycgu@mail.nankai.edu.cn); Ming-Ming Cheng (cmm@nankai.edu.cn; ORCID 0000-0001-5550-8758)
CODEN | ITPIDJ |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
DOI | 10.1109/TPAMI.2021.3134684 |
Discipline | Engineering Computer Science |
EISSN | 2160-9292 1939-3539 |
EndPage | 10269 |
ExternalDocumentID | 10_1109_TPAMI_2021_3134684 9647954 |
Genre | orig-research |
GrantInformation | National Natural Science Foundation of China (NSFC), grant 61922046 (funder ID 10.13039/501100001809); National Key Research and Development Program of China, grant 2018AAA0100400
ISSN | 0162-8828 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 12 |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
PMID | 34898430 |
PQID | 2734385710 |
PQPubID | 85458 |
PageCount | 9 |
PublicationCentury | 2000 |
PublicationDate | 2022-12-01 |
PublicationDecade | 2020 |
PublicationPlace | New York |
PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
PublicationTitleAbbrev | TPAMI |
PublicationYear | 2022 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 10261 |
SubjectTerms | Color imagery; Convolution; efficiency; Feature extraction; Fuses; Image restoration; implicit depth restoration; Neural networks; Object detection; Object recognition; Representations; RGB-D salient object detection; Salience; Semantics; Streaming media
Title | MobileSal: Extremely Efficient RGB-D Salient Object Detection |
URI | https://ieeexplore.ieee.org/document/9647954 https://www.proquest.com/docview/2734385710 https://search.proquest.com/docview/2610084549 |
Volume | 44 |