Abstractive Dialogue Summarization Based on Dynamic Pattern Exploiting Training

Bibliographic Details
Published in: 2022 International Joint Conference on Neural Networks (IJCNN), pp. 1-7
Main Authors: Chen, Zhanyi; Li, Changqun; Wang, Linlin; He, Liang
Format: Conference Proceeding
Language: English
Published: IEEE, 18.07.2022

Summary: Pre-trained language models (PLMs) have shown remarkable performance in natural language processing tasks, although these approaches often require a massive amount of data. Due to the lack of sufficient training instances, it is challenging for existing PLMs to achieve good results on dialogue summarization. In this paper, we propose DynamicPET, a pattern-exploiting training (PET)-based method for abstractive dialogue summarization, which leverages the recent prompt-learning paradigm to boost the performance of PLMs. In contrast to PET, our method does not rely on any task-specific unlabeled data, yet it obtains strong performance on two dialogue summarization datasets, especially in few-shot scenarios.
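
The abstract describes a prompt-learning (PET-style) formulation of abstractive dialogue summarization. As a rough, illustrative sketch only (not the authors' DynamicPET implementation), the snippet below shows how a pattern can wrap a raw dialogue into an instruction-style input for an off-the-shelf seq2seq PLM; the Hugging Face transformers library, the t5-small checkpoint, the pattern wording, and the toy dialogue are all assumptions made for illustration.

# Illustrative sketch only: a PET-style pattern wrapping a dialogue for a
# generic seq2seq PLM. Model name, pattern wording, and dialogue are assumed;
# this is not the paper's DynamicPET implementation.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "t5-small"  # assumed stand-in checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

dialogue = (
    "Alice: Are we still meeting at 3 pm?\n"
    "Bob: Yes, but can we move it to the cafe?\n"
    "Alice: Sure, see you there."
)

# The pattern turns the dialogue into a cloze/instruction-style input,
# which is the core idea of pattern-exploiting training applied to generation.
pattern = f"Summarize the dialogue: {dialogue} Summary:"

inputs = tokenizer(pattern, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=40, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))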
ISSN: 2161-4407
DOI: 10.1109/IJCNN55064.2022.9892323