A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation

Story generation, namely generating a reasonable story from a leading context, is an important but challenging task. Despite their success in modeling fluency and local coherence, existing neural language generation models (e.g., GPT-2) still suffer from repetition, logic conflicts, and lack of l...

Bibliographic Details
Published in: Transactions of the Association for Computational Linguistics, Vol. 8, pp. 93-108
Main Authors: Guan, Jian; Huang, Fei; Zhao, Zhihao; Zhu, Xiaoyan; Huang, Minlie
Format: Journal Article
Language: English
Published: Cambridge, MA, USA: MIT Press, 01.01.2020