Multilingual Denoising Pre-training for Neural Machine Translation

This paper demonstrates that multilingual denoising pre-training produces significant performance gains across a wide variety of machine translation (MT) tasks. We present mBART, a sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the BART objective [...]
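The denoising pre-training mentioned in the abstract trains the model to reconstruct original text from a corrupted version. A minimal sketch of one such BART-style corruption, text infilling (replacing contiguous spans with a single mask token), is below; the `MASK` symbol, function name, and span-sampling details are illustrative assumptions, not the paper's implementation.

```python
import random

# Assumed mask symbol for this sketch; real tokenizers define their own.
MASK = "<mask>"

def mask_spans(tokens, mask_ratio=0.35, mean_span=3, rng=None):
    """Replace random contiguous spans (covering roughly mask_ratio of the
    tokens) with a single MASK token each, BART-infilling style."""
    rng = rng or random.Random(0)
    tokens = list(tokens)
    budget = int(len(tokens) * mask_ratio)  # how many tokens to corrupt
    while budget > 0 and len(tokens) > 1:
        # Sample a span length (roughly geometric around mean_span).
        span = min(budget, max(1, int(rng.expovariate(1 / mean_span))))
        start = rng.randrange(0, max(1, len(tokens) - span))
        tokens[start:start + span] = [MASK]  # whole span becomes one mask
        budget -= span
    return tokens

noised = mask_spans("the quick brown fox jumps over the lazy dog".split())
```

During pre-training, the encoder would receive the noised sequence and the decoder would be trained to emit the original; the paper also applies sentence permutation as a second noising function.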

Bibliographic Details
Published in: Transactions of the Association for Computational Linguistics, Vol. 8, pp. 726-742
Main Authors: Liu, Yinhan; Gu, Jiatao; Goyal, Naman; Li, Xian; Edunov, Sergey; Ghazvininejad, Marjan; Lewis, Mike; Zettlemoyer, Luke
Format: Journal Article
Language: English
Published: One Rogers Street, Cambridge, MA 02142-1209, USA: MIT Press, 01.01.2020