ConvTKG: A query-aware convolutional neural network-based embedding model for temporal knowledge graph completion

Bibliographic Details
Published in: Neurocomputing (Amsterdam), Vol. 588, p. 127680
Main Authors: He, Mingsheng; Zhu, Lin; Bai, Luyi
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.07.2024
More Information
Summary: Although temporal knowledge graphs (TKGs) have successfully benefited many artificial intelligence applications, they are still far from complete. To alleviate the incompleteness of TKGs, temporal knowledge graph completion (TKGC) aims to perform link prediction and to reason new time-sensitive facts from existing facts. Among the various TKGC methods, temporal knowledge graph embedding (TKGE) methods represent entities, relations, and timestamps in low-dimensional spaces and are known for their high reasoning performance and robust scalability. Some previous research on TKGE aims to endow the embedding quadruple with a translational characteristic by modeling the global relationship within the quadruple. However, the weights assigned to same-dimensional entries of the embedding quadruple are fixed and unique, which limits the ability to capture diverse global relationships. In addition, most TKGE models treat timestamps as generic features and overlook both their interrelations and their correlations with the queries. Moreover, the distinct characteristics of an entity acting as the subject or the object of facts have not been fully leveraged. In this paper, we propose ConvTKG, a query-aware embedding model based on a convolutional neural network (CNN), to perform TKGE and thereby tackle the TKGC task. We design a new temporal information encoder based on a Gated Recurrent Unit (GRU) and an attention mechanism to learn query-aware representations of temporal information. More importantly, we employ a CNN-based decoder in which multiple one-dimensional convolution kernels operate on the matrix composed of the entity embedding, the relation embedding, and the temporal representation to capture the global relationships among them. In addition, we assign two separate vectors to each entity and take advantage of inverse relations to allow the two vectors to be learned dependently. Experiments show that ConvTKG achieves better link prediction performance than previous state-of-the-art baseline methods for TKGC on three benchmark datasets: ICEWS14, ICEWS05-15, and GDELT.
Highlights:
• We propose the first weight-adaptive TKGE model with a translational characteristic.
• We introduce a new query-aware timestamp encoding method for TKGC.
• Two dependent embedding vectors for each entity capture its distinct semantics.
• Experiments demonstrate the effectiveness of ConvTKG on three TKGC benchmarks.
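
The abstract describes two main components: a GRU-plus-attention encoder that produces a query-aware representation of the timestamp, and a CNN-based decoder whose one-dimensional convolution kernels operate on the matrix formed by the entity embedding, the relation embedding, and the temporal representation, with two role-specific vectors per entity tied together through inverse relations. The PyTorch sketch below illustrates one plausible reading of these ideas; the module names, the tokenized timestamp input, the kernel configuration, the linear projection, and the dot-product scoring against object embeddings are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class QueryAwareTimeEncoder(nn.Module):
    """GRU + attention over timestamp tokens, attended by the (subject, relation) query."""

    def __init__(self, dim: int, time_vocab: int):
        super().__init__()
        self.tok_emb = nn.Embedding(time_vocab, dim)   # embeddings for timestamp tokens (e.g. year/month/day)
        self.gru = nn.GRU(dim, dim, batch_first=True)  # sequential encoding of the token sequence
        self.attn = nn.Linear(2 * dim, 1)              # additive attention scored against the query

    def forward(self, time_tokens, query):
        # time_tokens: (B, T) token ids; query: (B, dim), e.g. subject + relation embedding
        h, _ = self.gru(self.tok_emb(time_tokens))                                   # (B, T, dim)
        q = query.unsqueeze(1).expand_as(h)                                          # broadcast query over time steps
        a = F.softmax(self.attn(torch.cat([h, q], dim=-1)).squeeze(-1), dim=-1)      # (B, T) attention weights
        return (a.unsqueeze(-1) * h).sum(dim=1)                                      # (B, dim) query-aware time vector


class ConvDecoder(nn.Module):
    """1-D convolutions over the stacked (subject, relation, time) embedding matrix."""

    def __init__(self, dim: int, n_entities: int, n_relations: int, n_kernels: int = 32):
        super().__init__()
        # Two embedding tables per entity role (subject vs. object); inverse relations
        # (ids n_relations .. 2*n_relations - 1) let both roles be trained from the same facts.
        self.subj_emb = nn.Embedding(n_entities, dim)
        self.obj_emb = nn.Embedding(n_entities, dim)
        self.rel_emb = nn.Embedding(2 * n_relations, dim)
        # Each kernel spans the three stacked rows, so every kernel learns its own
        # weighting of the subject/relation/time entries in each dimension.
        self.conv = nn.Conv1d(in_channels=3, out_channels=n_kernels, kernel_size=1)
        self.proj = nn.Linear(n_kernels * dim, dim)

    def forward(self, subj_idx, rel_idx, time_repr):
        # subj_idx, rel_idx: (B,) ids; time_repr: (B, dim) from the temporal encoder
        s, r = self.subj_emb(subj_idx), self.rel_emb(rel_idx)
        x = torch.stack([s, r, time_repr], dim=1)              # (B, 3, dim) input matrix
        feat = torch.relu(self.conv(x)).flatten(start_dim=1)   # (B, n_kernels * dim)
        pred = self.proj(feat)                                 # (B, dim) predicted object embedding
        return pred @ self.obj_emb.weight.t()                  # (B, n_entities) scores over candidates
```

In this sketch, a query (s, r, ?, t) is answered by feeding the encoder's time representation to the decoder together with the subject and relation indices, and a subject-side query (?, r, o, t) can reuse the same decoder through the inverse-relation indices, which is what couples the two per-entity vectors during training.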
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2024.127680