A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
Story generation, namely, generating a reasonable story from a leading context, is an important but challenging task. In spite of the success in modeling fluency and local coherence, existing neural language generation models (e.g., GPT-2) still suffer from repetition, logic conflicts, and lack of l...
Published in: Transactions of the Association for Computational Linguistics, Vol. 8, pp. 93–108
Format: Journal Article
Language: English
Publisher: The MIT Press, One Rogers Street, Cambridge, MA 02142-1209, USA
Published: 01.01.2020