SeDyT: A General Framework for Multi-Step Event Forecasting via Sequence Modeling on Dynamic Entity Embeddings

Temporal Knowledge Graphs store events in the form of subjects, relations, objects, and timestamps, and are often represented as dynamic heterogeneous graphs. Event forecasting, which predicts the subject or object of a future event, is a critical and challenging task in Temporal Knowledge Graph reasoning. To obtain temporal embeddings multiple steps into the future, existing methods learn generative models that capture the joint distribution of the observed events. To reduce their high computation costs, these methods rely on unrealistic independence assumptions and approximations in both training and inference. In this work, we propose SeDyT, a discriminative framework that performs sequence modeling on dynamic entity embeddings to solve the multi-step event forecasting problem. SeDyT consists of two components: a Temporal Graph Neural Network that generates dynamic entity embeddings for past timestamps and a sequence model that predicts entity embeddings at future timestamps. Compared with generative models, SeDyT does not rely on any heuristic-based probability model and has low computational complexity in both training and inference. SeDyT is compatible with most Temporal Graph Neural Networks and sequence models. We also design an efficient training method that trains the two components in a single gradient descent pass. We evaluate SeDyT on five popular datasets. By combining Temporal Graph Neural Network models and sequence models, SeDyT achieves an average MRR improvement of 2.4% when not using the validation set and of more than 10% when using the validation set.
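The abstract describes a two-component pipeline: a Temporal Graph Neural Network encodes entity embeddings over past timestamps, and a sequence model extrapolates them to a future timestamp, with both components trained jointly. The code below is a minimal sketch of that pipeline, not the paper's implementation: the module names, the embedding-table stand-in for the Temporal GNN, the choice of a GRU as the sequence model, and all dimensions are assumptions made purely for illustration.

    import torch
    import torch.nn as nn

    class TemporalGNNStub(nn.Module):
        # Placeholder for a Temporal GNN encoder (e.g., a TGN/TGAT-style model);
        # here an embedding table stands in for message passing over the graph.
        def __init__(self, num_entities, dim):
            super().__init__()
            self.embed = nn.Embedding(num_entities, dim)

        def forward(self, entity_ids):
            # entity_ids: (batch, history_len) -> (batch, history_len, dim)
            return self.embed(entity_ids)

    class SeDyTSketch(nn.Module):
        # Encoder produces past entity embeddings; a sequence model predicts a
        # future embedding; a linear scorer ranks all candidate object entities.
        def __init__(self, num_entities, num_relations, dim):
            super().__init__()
            self.encoder = TemporalGNNStub(num_entities, dim)
            self.seq_model = nn.GRU(dim, dim, batch_first=True)  # could also be a Transformer
            self.rel_embed = nn.Embedding(num_relations, dim)
            self.scorer = nn.Linear(2 * dim, num_entities)

        def forward(self, subject_history, relation_ids):
            past = self.encoder(subject_history)        # (B, T, D) past embeddings
            _, h = self.seq_model(past)                 # h: (1, B, D) predicted future state
            future = h.squeeze(0)                       # (B, D)
            rel = self.rel_embed(relation_ids)          # (B, D)
            return self.scorer(torch.cat([future, rel], dim=-1))  # (B, num_entities)

    # Toy usage with random data: one optimizer step updates both components.
    model = SeDyTSketch(num_entities=1000, num_relations=50, dim=64)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    subjects = torch.randint(0, 1000, (32, 8))   # (batch, history length)
    relations = torch.randint(0, 50, (32,))
    targets = torch.randint(0, 1000, (32,))      # ground-truth object entities
    loss = nn.functional.cross_entropy(model(subjects, relations), targets)
    opt.zero_grad(); loss.backward(); opt.step()

A single optimizer step back-propagates through both the sequence model and the encoder, which mirrors the abstract's claim that the two components are trained together in one gradient descent pass.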
