Contrastive Meta Learning for Soft Prompts Using Dynamic Mixup

Bibliographic Details
Published in: 2024 International Joint Conference on Neural Networks (IJCNN), pp. 1-6
Main Authors: Chien, Jen-Tzung; Wang, Hsin-Ti; Lee, Ching-Hsien
Format: Conference Proceeding
Language: English
Published: IEEE, 30.06.2024

Summary: Soft prompting is crucial in few-shot settings, where parameter-efficient learning is essential for natural language understanding (NLU). A recent line of work merges meta-learning schemes across different domains. This study presents a new metric-based meta-learning method in which a contrastive perspective enhances the discrimination among confusing classes, enabling rapid domain adaptation to unseen text classification tasks. To further address the generalization issue, the paper proposes a mixup augmentation whose mixup ratio is estimated automatically according to the maximum entropy principle. Experimental results on a series of NLU tasks show the merit of the proposed method in terms of classification accuracy, parameter efficiency, and latent visualization in most few-shot settings.
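
The abstract states that the mixup ratio is estimated automatically from the maximum entropy principle, but the record does not give the estimator itself. As a rough illustration only, one plausible reading is to search candidate ratios and keep the one whose mixed examples yield the highest predictive entropy. The PyTorch sketch below follows that reading; the function names, the candidate grid, and the entropy-of-prediction criterion are all assumptions of this sketch, not the paper's published method.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def prediction_entropy(logits):
        # Shannon entropy of the softmax distribution, per example.
        log_p = F.log_softmax(logits, dim=-1)
        return -(log_p.exp() * log_p).sum(dim=-1)

    def dynamic_mixup(x_i, x_j, classifier,
                      candidates=torch.linspace(0.1, 0.9, 9)):
        # Hypothetical estimator: try candidate mixup ratios and keep
        # the one whose mixed batch has the highest mean predictive
        # entropy (a maximum-entropy heuristic). Labels would be mixed
        # with the same ratio, y_mix = lam * y_i + (1 - lam) * y_j.
        best_lam, best_h = 0.5, float("-inf")
        with torch.no_grad():
            for lam in candidates.tolist():
                x_mix = lam * x_i + (1.0 - lam) * x_j
                h = prediction_entropy(classifier(x_mix)).mean().item()
                if h > best_h:
                    best_h, best_lam = h, lam
        return best_lam * x_i + (1.0 - best_lam) * x_j, best_lam

    # Toy usage: mix two batches of 16-dim features through a linear head.
    classifier = nn.Linear(16, 5)
    x_i, x_j = torch.randn(8, 16), torch.randn(8, 16)
    x_mix, lam = dynamic_mixup(x_i, x_j, classifier)
    print(f"selected mixup ratio: {lam:.2f}")

In this reading, a high-entropy mixed sample sits near the decision boundary between the two source classes, which is where added training signal would plausibly help the discrimination among confusing classes that the abstract emphasizes.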
ISSN: 2161-4407
DOI: 10.1109/IJCNN60899.2024.10651477