A Generative Model for Joint Natural Language Understanding and Generation
Main Authors | Bo-Hsiang Tseng, Jianpeng Cheng, Yimai Fang, David Vandyke
---|---
Format | Journal Article
Language | English
Published | 12.06.2020
Summary:

Natural language understanding (NLU) and natural language generation (NLG) are two fundamental and related tasks in building task-oriented dialogue systems with opposite objectives: NLU tackles the transformation from natural language to formal representations, whereas NLG does the reverse. A key to success in either task is parallel training data which is expensive to obtain at a large scale. In this work, we propose a generative model which couples NLU and NLG through a shared latent variable. This approach allows us to explore both spaces of natural language and formal representations, and facilitates information sharing through the latent space to eventually benefit NLU and NLG. Our model achieves state-of-the-art performance on two dialogue datasets with both flat and tree-structured formal representations. We also show that the model can be trained in a semi-supervised fashion by utilising unlabelled data to boost its performance.
DOI: 10.48550/arxiv.2006.07499
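
The abstract describes one latent variable shared between the NLU direction (utterance to formal representation) and the NLG direction (the reverse). To make the coupling concrete, below is a minimal PyTorch sketch of such a model: two encoders and two decoders meeting in a single latent space. The class name, GRU layers, dimensions, and the objective are illustrative assumptions, not the paper's exact architecture or inference procedure.

```python
# Minimal sketch (PyTorch) of NLU and NLG coupled through one shared latent
# variable z. Module names, dimensions, and the GRU choice are assumptions
# for illustration; they are not the paper's exact architecture.
import torch
import torch.nn as nn

class JointNLUNLG(nn.Module):
    def __init__(self, vocab_nl, vocab_mr, d_model=256, d_latent=64):
        super().__init__()
        self.emb_nl = nn.Embedding(vocab_nl, d_model)   # natural-language tokens
        self.emb_mr = nn.Embedding(vocab_mr, d_model)   # formal-representation tokens
        self.enc_nl = nn.GRU(d_model, d_model, batch_first=True)
        self.enc_mr = nn.GRU(d_model, d_model, batch_first=True)
        # Both encoders parameterise a Gaussian posterior over the SAME z space.
        self.to_mu = nn.Linear(d_model, d_latent)
        self.to_logvar = nn.Linear(d_model, d_latent)
        # Decoders conditioned on z: p(mr | z) realises NLU, p(nl | z) realises NLG.
        self.dec_nl = nn.GRU(d_model + d_latent, d_model, batch_first=True)
        self.dec_mr = nn.GRU(d_model + d_latent, d_model, batch_first=True)
        self.out_nl = nn.Linear(d_model, vocab_nl)
        self.out_mr = nn.Linear(d_model, vocab_mr)

    def encode(self, tokens, side):
        emb = (self.emb_nl if side == "nl" else self.emb_mr)(tokens)
        _, h = (self.enc_nl if side == "nl" else self.enc_mr)(emb)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterisation trick: sample z while keeping gradients to mu/logvar.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return z, mu, logvar

    def decode(self, z, tokens, side):
        # Teacher-forced decoding; z is concatenated to every input step.
        emb = (self.emb_nl if side == "nl" else self.emb_mr)(tokens)
        z_rep = z.unsqueeze(1).expand(-1, emb.size(1), -1)
        out, _ = (self.dec_nl if side == "nl" else self.dec_mr)(
            torch.cat([emb, z_rep], dim=-1))
        return (self.out_nl if side == "nl" else self.out_mr)(out)

def supervised_loss(model, nl, mr):
    """One hypothetical objective for a paired (utterance, representation)
    batch: encode the utterance, decode BOTH sides through the shared z, and
    add a KL term pulling q(z|x) towards a standard-normal prior."""
    z, mu, logvar = model.encode(nl, "nl")
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    ce = nn.CrossEntropyLoss()
    nlu = ce(model.decode(z, mr[:, :-1], "mr").transpose(1, 2), mr[:, 1:])
    nlg = ce(model.decode(z, nl[:, :-1], "nl").transpose(1, 2), nl[:, 1:])
    return nlu + nlg + kl
```

Under this shape, the semi-supervised training mentioned in the abstract would amount to also optimising autoencoding paths on unpaired data: reconstruct an unlabelled utterance through z with the NL decoder alone (and likewise for unlabelled formal representations), dropping the cross-direction term when the paired side is missing.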