Comparative Analysis of Current Approaches to Quality Estimation for Neural Machine Translation

Bibliographic Details
Published in: Applied Sciences, Vol. 11, No. 14, p. 6584
Main Authors: Eo, Sugyeong; Park, Chanjun; Moon, Hyeonseok; Seo, Jaehyung; Lim, Heuiseok
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.07.2021

Summary: Quality estimation (QE) has recently gained increasing interest because it can predict the quality of machine translation output without a reference translation. QE is an annual shared task at the Conference on Machine Translation (WMT), and most recent studies have applied multilingual pretrained language models (mPLMs) to address it, focusing on performance improvements through data augmentation combined with fine-tuning of a large-scale mPLM. In this study, we eliminate the effects of data augmentation and conduct a pure performance comparison between various mPLMs. Unlike recent performance-driven QE research aimed at shared-task competitions, we use this comparison on the WMT20 sub-tasks to identify an optimal mPLM. Moreover, we demonstrate QE using the multilingual BART model, which has not previously been applied to this task, and conduct comparative experiments and analyses against cross-lingual language models (XLMs), multilingual BERT, and XLM-RoBERTa.
ISSN: 2076-3417
DOI: 10.3390/app11146584
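
The summary above frames QE as predicting a quality score for a (source, machine translation) pair with a fine-tuned mPLM. Below is a minimal sketch of that setup, assuming the Hugging Face transformers library; the checkpoint name, the example sentence pair, and the untrained regression head are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch: sentence-level QE as regression with a multilingual
# pretrained LM (mPLM). Illustrative only -- the checkpoint, the example
# sentences, and the untrained scalar head are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "xlm-roberta-base"  # one of the mPLM families compared in the paper
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name,
    num_labels=1,  # single scalar output: the predicted quality score
)

# QE scores a (source, machine translation) pair -- no reference translation needed.
src = "Das ist ein Beispiel."  # hypothetical source sentence
mt = "This is an example."     # hypothetical MT output
inputs = tokenizer(src, mt, return_tensors="pt", truncation=True)

model.eval()
with torch.no_grad():
    # Untrained here; in practice the head is fine-tuned on QE-labeled data.
    score = model(**inputs).logits.squeeze(-1)
print(float(score))
```

Swapping model_name for multilingual BERT, XLM, or mBART checkpoints and fine-tuning the scalar head on WMT-style sentence-level QE data would yield the kind of controlled mPLM comparison the paper describes.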