GETT-QA: Graph Embedding based T2T Transformer for Knowledge Graph Question Answering
Format | Journal Article |
---|---|
Language | English |
Published | 23.03.2023 |
Summary: | In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for the LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata. |
---|---|
DOI: | 10.48550/arxiv.2303.13284 |
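The pipeline the summary describes (the model emits a skeleton SPARQL query containing entity and relation labels plus a truncated KG embedding per entity; the labels are then grounded to Wikidata IDs, with the truncated embedding used to rank candidates) can be illustrated with a short sketch. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the shape of the T5 output, the `<ent>`/`<rel>` placeholders, the toy label index, and all embedding values are hypothetical.

```python
# Minimal sketch of a GETT-QA-style grounding step. Everything here is a
# stand-in: real systems would use a label index over Wikidata and the KG's
# actual (e.g. TransE-style) entity embeddings.
import numpy as np

# Hypothetical T5 output for "Who founded Tesla?": labels in <ent>/<rel>
# slots and a truncated (first-k dimensions) embedding per entity.
t5_output = {
    "skeleton": "SELECT ?x WHERE { <ent> <rel> ?x }",
    "entities": [("Tesla", [0.12, -0.40, 0.33])],  # (label, truncated embedding)
    "relations": ["founded by"],
}

# Toy label index: label -> candidate (Wikidata ID, full KG embedding).
label_index = {
    "Tesla": [
        ("Q478214", np.array([0.11, -0.38, 0.35, 0.9])),   # Tesla, Inc.
        ("Q9036",   np.array([0.70,  0.25, -0.1, -0.4])),  # Nikola Tesla
    ],
}
relation_index = {"founded by": "P112"}

def ground_entity(label, truncated_emb):
    """Pick the candidate whose KG embedding, truncated to the same number
    of dimensions, is closest (cosine) to the generated embedding."""
    q = np.array(truncated_emb)
    k = len(q)
    def score(cand_emb):
        c = cand_emb[:k]
        return float(q @ c / (np.linalg.norm(q) * np.linalg.norm(c)))
    return max(label_index[label], key=lambda cand: score(cand[1]))[0]

def ground_query(out):
    """Replace label placeholders in the skeleton with grounded KG IDs."""
    sparql = out["skeleton"]
    for label, emb in out["entities"]:
        sparql = sparql.replace("<ent>", f"wd:{ground_entity(label, emb)}", 1)
    for label in out["relations"]:
        sparql = sparql.replace("<rel>", f"wdt:{relation_index[label]}", 1)
    return sparql

print(ground_query(t5_output))
# -> SELECT ?x WHERE { wd:Q478214 wdt:P112 ?x }
```

The role of the truncated embedding is visible in this toy example: both candidates share the label "Tesla", so the label alone cannot separate the company from the inventor, but cosine similarity against the generated embedding prefix selects the intended entity.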