AMR-To-Text Generation with Graph Transformer
Published in: Transactions of the Association for Computational Linguistics, Vol. 8, pp. 19-33
Main Authors: Tianming Wang, Xiaojun Wan, Hanqi Jin
Format: Journal Article
Language: English
Published: The MIT Press, One Rogers Street, Cambridge, MA 02142-1209, USA, 01.01.2020
Summary: Abstract meaning representation (AMR)-to-text generation is the challenging task of generating natural language texts from AMR graphs, where nodes represent concepts and edges denote relations. Current state-of-the-art methods use graph-to-sequence models; however, they still cannot significantly outperform previous sequence-to-sequence models or statistical approaches. In this paper, we propose a novel graph-to-sequence model, the Graph Transformer, to address this task. The model directly encodes AMR graphs and learns node representations. A pairwise interaction function computes the semantic relations between concepts, and attention mechanisms aggregate information from a node's incoming and outgoing neighbors, which helps the model capture semantic information effectively. Our model outperforms the state-of-the-art neural approach by 1.5 BLEU points on LDC2015E86 and 4.8 BLEU points on LDC2017T10, achieving new state-of-the-art performance.
ISSN: 2307-387X
DOI: 10.1162/tacl_a_00297