Contrastive Meta Learning for Soft Prompts Using Dynamic Mixup
Published in | 2024 International Joint Conference on Neural Networks (IJCNN), pp. 1-6
Main Authors | , , |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 30.06.2024
Summary: | Soft prompting is crucial for parameter-efficient learning in few-shot natural language understanding (NLU). A recent line of work merges meta-learning schemes across different domains. This study presents a new metric-based meta-learning method in which a contrastive perspective is used to sharpen discrimination among confusing classes, enabling rapid domain adaptation to unseen text-classification tasks. To further address generalization, the paper proposes a mixup augmentation whose mixup ratio is estimated automatically according to the maximum-entropy principle. Experimental results on a series of NLU tasks show the merit of the proposed method in classification accuracy, parameter efficiency, and latent visualization in most few-shot settings.
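The summary's "dynamic mixup" selects a mixing ratio by a maximum-entropy criterion rather than drawing it from a fixed distribution. A minimal sketch of that idea follows; the grid search, the `logits_fn` scorer, and all names here are illustrative assumptions for exposition, not the paper's actual estimator:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    # Shannon entropy of a probability vector.
    return -sum(p * math.log(p + 1e-12) for p in probs)

def dynamic_mixup(x_a, x_b, logits_fn, steps=20):
    """Grid-search stand-in for a maximum-entropy mixup ratio:
    pick the lambda whose mixed sample receives the most uncertain
    (highest-entropy) prediction from the (hypothetical) logits_fn."""
    best_lam, best_h = 0.5, float("-inf")
    for i in range(steps + 1):
        lam = i / steps
        mixed = [lam * a + (1 - lam) * b for a, b in zip(x_a, x_b)]
        h = entropy(softmax(logits_fn(mixed)))
        if h > best_h:
            best_lam, best_h = lam, h
    mixed = [best_lam * a + (1 - best_lam) * b for a, b in zip(x_a, x_b)]
    return mixed, best_lam

# Toy linear two-class scorer standing in for a prompt-tuned model.
W = [[0.9, -0.4, 0.3, 0.1], [-0.2, 0.8, -0.5, 0.6]]
logits_fn = lambda x: [sum(w * v for w, v in zip(row, x)) for row in W]
mixed, lam = dynamic_mixup([1.0, 0.0, 0.5, -0.2], [-0.3, 1.2, 0.1, 0.4], logits_fn)
```

For two classes, this criterion favors a mixed point the scorer finds maximally ambiguous, which is one plausible reading of estimating the ratio "according to the maximum entropy principle"; the paper itself may compute the ratio analytically rather than by search.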
ISSN | 2161-4407
DOI | 10.1109/IJCNN60899.2024.10651477