Text-enhanced Multi-Granularity Temporal Graph Learning for Event Prediction


Bibliographic Details
Published in: 2022 IEEE International Conference on Data Mining (ICDM), pp. 171-180
Main Authors: Han, Xiaoxue; Ning, Yue
Format: Conference Proceeding
Language: English
Published: IEEE, 01.11.2022

Summary: Forecasting the future is all about learning from the past. However, modeling the past is non-trivial due to the scale and complexity of available data. Recently, Graph Neural Networks (GNNs) have shown the flexibility to process different forms of data and to learn interactions among entities, giving them advantages in real-life applications. More and more researchers have started to apply GNNs and temporal models to event forecasting because events can be formalized in knowledge graphs. However, most of these models rely on the Markov assumption that the probability of an event is influenced only by the state of its last time step (or recent history). We claim that the occurrence of an event has not only short-term but also long-term dependencies. In this work, we propose a temporal knowledge graph (KG)-based model that considers different granularities of history when forecasting an event; the method also integrates news texts as auxiliary features during the graph learning process. Extensive experiments on multiple datasets are conducted to examine the effectiveness of the proposed method. Code is available at: https://github.com/yuening-lab/MTG.
ISSN:2374-8486
DOI:10.1109/ICDM54844.2022.00027
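As a rough illustration of the multi-granularity idea described in the summary above, the sketch below combines a fine-grained (e.g., daily) history encoder with a coarse-grained (e.g., weekly) one before scoring an event. This is a hypothetical PyTorch sketch, not the authors' released MTG implementation (see the repository linked in the summary); the class name, the recurrent encoders, the pooling granularities, and the dimensions are all assumptions made for illustration.

# Hypothetical sketch (not the authors' MTG code): fuse short-term and
# long-term history representations before predicting an event.
import torch
import torch.nn as nn

class MultiGranularityScorer(nn.Module):
    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        # Separate GRUs summarize per-step graph embeddings at two granularities.
        self.short_rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.long_rnn = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, 1)

    def forward(self, recent_steps: torch.Tensor, coarse_steps: torch.Tensor) -> torch.Tensor:
        # recent_steps: (batch, T_short, hidden), e.g. daily graph embeddings
        # coarse_steps: (batch, T_long, hidden), e.g. weekly-pooled embeddings
        _, h_short = self.short_rnn(recent_steps)   # final state: (1, batch, hidden)
        _, h_long = self.long_rnn(coarse_steps)     # final state: (1, batch, hidden)
        joint = torch.cat([h_short[-1], h_long[-1]], dim=-1)
        return torch.sigmoid(self.classifier(joint))  # event probability

# Toy usage with random graph-level embeddings in place of GNN outputs.
model = MultiGranularityScorer(hidden_dim=64)
recent = torch.randn(8, 7, 64)    # last 7 days
coarse = torch.randn(8, 12, 64)   # last 12 weeks, pre-pooled
print(model(recent, coarse).shape)  # torch.Size([8, 1])

In the paper's setting, the per-step embeddings would come from a GNN over the temporal knowledge graph (with news-text features attached), whereas here they are random tensors used only to show how the two time scales are combined.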