Event Transition Planning for Open-ended Text Generation

Bibliographic Details
Main Authors: Li, Qintong; Li, Piji; Bi, Wei; Ren, Zhaochun; Lai, Yuxuan; Kong, Lingpeng
Format: Journal Article
Language: English
Published: 20.04.2022
Summary: Open-ended text generation tasks, such as dialogue generation and story completion, require models to generate a coherent continuation given limited preceding context. The open-ended nature of these tasks brings new challenges to today's neural auto-regressive text generators. Although these neural models are good at producing human-like text, it is difficult for them to arrange causalities and relations between given facts and possible ensuing events. To bridge this gap, we propose a novel two-stage method that explicitly arranges the ensuing events in open-ended text generation. Our approach can be understood as a specially trained coarse-to-fine algorithm, where an event transition planner provides a "coarse" plot skeleton and a text generator in the second stage refines that skeleton. Experiments on two open-ended text generation tasks demonstrate that our proposed method effectively improves the quality of the generated text, especially in coherence and diversity. The code is available at: \url{https://github.com/qtli/EventPlanforTextGen}.
DOI: 10.48550/arxiv.2204.09453
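
To make the coarse-to-fine idea in the summary concrete, the following is a minimal, hypothetical sketch of a two-stage plan-then-generate pipeline. It uses off-the-shelf GPT-2 checkpoints from Hugging Face Transformers as stand-ins for the paper's trained event transition planner and skeleton-conditioned generator; the prompt formats, helper name plan_then_generate, and decoding settings are illustrative assumptions, not the authors' implementation (which is available at the GitHub URL above).

```python
# Hypothetical two-stage sketch: a planner drafts a "coarse" event skeleton,
# then a generator refines it into a continuation of the given context.
# Plain GPT-2 checkpoints stand in for the paper's fine-tuned models.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
planner = GPT2LMHeadModel.from_pretrained("gpt2")    # stage 1: event transition planner (stand-in)
generator = GPT2LMHeadModel.from_pretrained("gpt2")  # stage 2: skeleton-conditioned generator (stand-in)


def plan_then_generate(context: str, max_plan_tokens: int = 30, max_text_tokens: int = 80) -> str:
    # Stage 1: sample an event skeleton conditioned on the preceding context.
    plan_prompt = f"Context: {context}\nEvent plan:"
    plan_ids = tokenizer(plan_prompt, return_tensors="pt").input_ids
    plan_out = planner.generate(
        plan_ids,
        max_length=plan_ids.shape[1] + max_plan_tokens,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    skeleton = tokenizer.decode(plan_out[0, plan_ids.shape[1]:], skip_special_tokens=True)

    # Stage 2: refine the skeleton into a fluent continuation of the context.
    gen_prompt = f"Context: {context}\nEvent plan: {skeleton}\nContinuation:"
    gen_ids = tokenizer(gen_prompt, return_tensors="pt").input_ids
    gen_out = generator.generate(
        gen_ids,
        max_length=gen_ids.shape[1] + max_text_tokens,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(gen_out[0, gen_ids.shape[1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(plan_then_generate("Jenny spilled coffee on her laptop right before the deadline."))
```

In this sketch the explicit skeleton is what separates the approach from a single-pass decoder: the generator is conditioned on both the context and a planned event sequence, rather than being asked to arrange events and realize fluent text in one step.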