Multi-Scale Metric Learning for Few-Shot Learning
Published in | IEEE Transactions on Circuits and Systems for Video Technology, Vol. 31, No. 3, pp. 1091-1102 |
Main Authors | Jiang, Wen; Huang, Kai; Geng, Jie; Deng, Xinyang |
Format | Journal Article |
Language | English |
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.03.2021 |
Subjects | Classification; Feature extraction; Few-shot learning; Image classification; Learning; Learning systems; Measurement; metric learning; multi-scale feature maps; Neural networks; Semantics; Task analysis; Training |
Abstract | Few-shot learning in image classification aims to learn a model that can identify unseen classes from only a few training samples per class. The scarcity of training samples and the novelty of the classification tasks make many traditional classification models inapplicable. In this paper, a novel few-shot learning method named multi-scale metric learning (MSML) is proposed to extract multi-scale features and learn the multi-scale relations between samples for few-shot classification. In the proposed method, a feature pyramid structure is introduced for multi-scale feature embedding, which combines high-level, strongly semantic features with low-level but abundant visual features. A multi-scale relation generation network (MRGN) is then developed for hierarchical metric learning, in which high-level features are paired with deeper metric networks and low-level features with lighter ones. Moreover, a novel loss function named the intra-class and inter-class relation loss (IIRL) is proposed to optimize the network: it strengthens the correlation between homogeneous groups of samples and weakens the correlation between heterogeneous groups. Experimental results on miniImageNet and tieredImageNet demonstrate that the proposed method achieves superior performance on few-shot learning problems. |
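The abstract presumes the standard episodic evaluation protocol used on miniImageNet and tieredImageNet, in which each task is an N-way K-shot classification episode. The Python sketch below illustrates that protocol only; the function name, dataset layout, and 5-way 1-shot defaults are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of N-way K-shot episode sampling, the standard few-shot
# protocol on miniImageNet/tieredImageNet. Names and defaults are assumptions.
import random

def sample_episode(images_by_class, n_way=5, k_shot=1, n_query=15):
    """Draw one episode: k_shot labeled support samples for each of n_way
    randomly chosen classes, plus n_query query samples per class to classify."""
    classes = random.sample(list(images_by_class), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        picks = random.sample(images_by_class[cls], k_shot + n_query)
        support += [(img, label) for img in picks[:k_shot]]
        query += [(img, label) for img in picks[k_shot:]]
    return support, query  # the model must relate each query to the support set
```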
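The feature pyramid embedding and the multi-scale relation generation network (MRGN) can likewise be sketched in PyTorch. This is a hedged rendering of the stated idea only: a backbone exposing features at several depths, with deeper learned metrics on high-level maps and lighter ones on low-level maps. All class names, layer sizes, and depths are assumptions rather than the authors' exact architecture.

```python
# Minimal sketch of multi-scale embedding plus hierarchical metric learning in
# the spirit of MSML/MRGN. Sizes, depths, and names are illustrative assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """3x3 conv -> batch norm -> ReLU -> 2x2 max-pool, a common few-shot backbone unit."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )

class MultiScaleEmbedding(nn.Module):
    """Backbone exposing feature maps at three depths (a small pyramid), so both
    low-level visual detail and high-level semantics are available."""
    def __init__(self):
        super().__init__()
        self.stage1 = conv_block(3, 64)    # low-level, high-resolution features
        self.stage2 = conv_block(64, 64)
        self.stage3 = conv_block(64, 64)   # high-level, strongly semantic features

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        f3 = self.stage3(f2)
        return [f1, f2, f3]

class RelationHead(nn.Module):
    """Learned metric for one scale: scores a concatenated support/query feature
    pair. `depth` controls how heavy the metric network is."""
    def __init__(self, channels, depth):
        super().__init__()
        blocks = [conv_block(channels * 2, channels)]
        blocks += [conv_block(channels, channels) for _ in range(depth - 1)]
        self.body = nn.Sequential(*blocks)
        self.score = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(channels, 1), nn.Sigmoid(),
        )

    def forward(self, support_feat, query_feat):
        pair = torch.cat([support_feat, query_feat], dim=1)  # concat along channels
        return self.score(self.body(pair))                   # relation score in [0, 1]

class MRGNSketch(nn.Module):
    """Hierarchical metric: a deeper head for the high-level scale, lighter heads
    for lower-level scales; per-scale scores are averaged into one relation."""
    def __init__(self):
        super().__init__()
        self.embed = MultiScaleEmbedding()
        self.heads = nn.ModuleList(RelationHead(64, d) for d in (1, 2, 3))

    def forward(self, support, query):
        # `support` and `query` are batches of paired images, shape (B, 3, H, W)
        s_feats, q_feats = self.embed(support), self.embed(query)
        scores = [h(s, q) for h, s, q in zip(self.heads, s_feats, q_feats)]
        return torch.stack(scores).mean(dim=0)               # (B, 1) fused scores
```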
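Finally, the intra-class and inter-class relation loss (IIRL) is described only at the level of strengthening relations within homogeneous groups and weakening them across heterogeneous groups. One plausible minimal rendering, assuming fused relation scores in [0, 1] and a mean-squared penalty (the published formulation may differ), is:

```python
# Hedged sketch of an intra-/inter-class relation loss in the spirit of IIRL.
# The authors' exact formulation may differ; this is an illustrative assumption.
import torch

def iirl_sketch(scores, support_labels, query_labels):
    """scores: (n_query, n_support) fused relation scores in [0, 1].
    Encourages high intra-class and low inter-class relation scores."""
    # 1 where a query/support pair shares a class (homogeneous), else 0
    same = (query_labels.unsqueeze(1) == support_labels.unsqueeze(0)).float()
    intra = ((scores - 1.0) ** 2 * same).mean()      # pull same-class scores to 1
    inter = (scores ** 2 * (1.0 - same)).mean()      # push cross-class scores to 0
    return intra + inter
```

Minimizing such a loss over many sampled episodes would pull same-class relation scores toward 1 and cross-class scores toward 0, matching the abstract's stated goal of tightening homogeneous groups while separating heterogeneous ones.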
Author Details | Wen Jiang (jiangwen@nwpu.edu.cn; ORCID 0000-0001-5429-2748), Kai Huang, Jie Geng (ORCID 0000-0003-4858-823X), and Xinyang Deng (ORCID 0000-0001-8181-7001), all with the School of Electronics and Information, Northwestern Polytechnical University, Xi'an, China |
CODEN | ITCTEM |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
DOI | 10.1109/TCSVT.2020.2995754 |
Discipline | Engineering |
EISSN | 1558-2205 |
EndPage | 1102 |
Genre | Original research |
Grant Information | Fundamental Research Funds for the Central Universities (G2019KY05301); National Natural Science Foundation of China (61901376); Peak Experience Plan in Northwestern Polytechnical University |
ISSN | 1051-8215 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
PageCount | 12 |
PublicationDate | 2021-03-01 |
PublicationPlace | New York |
PublicationTitle | IEEE Transactions on Circuits and Systems for Video Technology |
PublicationTitleAbbrev | TCSVT |
PublicationYear | 2021 |
Publisher | IEEE (The Institute of Electrical and Electronics Engineers, Inc.) |
StartPage | 1091 |
URI | https://ieeexplore.ieee.org/document/9097252 https://www.proquest.com/docview/2498872876 |
Volume | 31 |