Abstractive Dialogue Summarization Based on Dynamic Pattern Exploiting Training
Published in | 2022 International Joint Conference on Neural Networks (IJCNN), pp. 1 - 7 |
Main Authors | |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 18.07.2022 |
Summary: | Pre-trained language models (PLMs) have shown remarkable performance in natural language processing tasks, whereas these approaches often require a massive amount of data. Due to the lack of sufficient training instances, it is challenging for existing PLMs to achieve good results on dialogue summarization. In this paper, we propose DynamicPET, a pattern-exploiting training (PET) based method for abstractive dialogue summarization, which leverages the recent prompt learning paradigm to boost the performance of PLMs. In contrast to PET, our method does not rely on any task-specific unlabeled data, but obtains strong performance on two dialogue summarization datasets, especially in the few-shot scenarios. |
ISSN: | 2161-4407 |
DOI: | 10.1109/IJCNN55064.2022.9892323 |
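The summary above describes a pattern-exploiting training (PET) style use of prompt learning for dialogue summarization. The sketch below is not the paper's DynamicPET implementation; it only illustrates the general idea of wrapping a dialogue in a natural-language pattern and letting a pre-trained seq2seq model generate the summary. The pattern text, the `facebook/bart-large` checkpoint, and the generation settings are illustrative assumptions, not details taken from the paper.

```python
# Minimal, hypothetical sketch of PET-style prompting for dialogue summarization.
# Assumes the Hugging Face `transformers` library; the pattern wording and the
# backbone model are placeholders, not the paper's actual configuration.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "facebook/bart-large"  # assumed generic seq2seq PLM

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)


def apply_pattern(dialogue: str) -> str:
    """Wrap the raw dialogue in one natural-language pattern (prompt).

    PET-style methods typically define several such patterns; this single
    template is a hypothetical example.
    """
    return f"Dialogue: {dialogue} Summarize the above conversation:"


def summarize(dialogue: str, max_new_tokens: int = 60) -> str:
    """Encode the patterned input and decode a summary with beam search."""
    inputs = tokenizer(apply_pattern(dialogue), return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    example = ("Amanda: I baked cookies. Do you want some? "
               "Jerry: Sure! Amanda: I'll bring you some tomorrow.")
    print(summarize(example))
```

In a few-shot setting, the same patterned inputs would also be used as training examples for fine-tuning the PLM, rather than only at inference time as shown here.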