Adaptive prototype few-shot image classification method based on feature pyramid

Bibliographic Details
Published in: PeerJ Computer Science, Vol. 10, p. e2322
Main Authors: Shen, Linshan; Feng, Xiang; Xu, Li; Ding, Weiyue
Format: Journal Article
Language: English
Published: United States: PeerJ Inc., 01.10.2024
Summary: Few-shot learning aims to enable machines to recognize unseen novel classes from limited samples, akin to human capabilities. Metric learning is a crucial approach to this challenge, with its performance primarily dependent on the effectiveness of feature extraction and prototype computation. This article introduces an Adaptive Prototype few-shot image classification method based on Feature Pyramid (APFP). APFP employs a novel feature extraction method called FResNet, which builds upon the ResNet architecture and leverages a feature pyramid structure to retain finer details. In the 5-shot scenario, traditional methods that compute average prototypes exhibit limitations because samples are typically diverse and unevenly distributed, so a simple mean may inadequately reflect that diversity. To address this issue, APFP proposes an Adaptive Prototype method (AP) that dynamically computes class prototypes of the support set based on the similarity between support set samples and query samples. Experimental results demonstrate that APFP achieves 67.98% and 85.32% accuracy in the 5-way 1-shot and 5-way 5-shot scenarios on the MiniImageNet dataset, respectively, and 84.02% and 94.44% accuracy on the CUB dataset. These results indicate that the proposed APFP method effectively addresses the few-shot image classification problem.
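The abstract gives only a high-level description of AP (support samples weighted by their similarity to the query rather than averaged uniformly) and does not reproduce the paper's equations. The following is therefore a minimal PyTorch sketch of one plausible similarity-weighted prototype: the function name, the choice of cosine similarity, and the softmax weighting are illustrative assumptions, not the published method, and the FResNet feature-pyramid extractor is not sketched here.

import torch
import torch.nn.functional as F

def adaptive_prototypes(support_feats, query_feat, n_way, k_shot):
    # support_feats: (n_way * k_shot, d) embeddings of the support set
    # query_feat:    (d,) embedding of a single query image
    # Returns (n_way, d): one prototype per class, with each support
    # sample weighted by its similarity to the query instead of
    # contributing equally, as in a plain class mean.
    support = support_feats.view(n_way, k_shot, -1)
    sims = F.cosine_similarity(support, query_feat.view(1, 1, -1), dim=-1)
    weights = F.softmax(sims, dim=1).unsqueeze(-1)   # (n_way, k_shot, 1)
    return (weights * support).sum(dim=1)

# Toy usage: one 5-way 5-shot episode with 64-dim features.
support = torch.randn(25, 64)
query = torch.randn(64)
protos = adaptive_prototypes(support, query, n_way=5, k_shot=5)   # (5, 64)
scores = F.cosine_similarity(protos, query.unsqueeze(0), dim=-1)  # (5,)
pred = scores.argmax()                                            # predicted class

On this reading, each query induces its own set of class prototypes, and with k_shot = 1 the softmax weight is identically 1, so the scheme reduces to the ordinary single-sample prototype, consistent with the abstract's point that uniform averaging is limiting mainly in the 5-shot setting.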
ISSN: 2376-5992
DOI: 10.7717/peerj-cs.2322