EasyDGL: Encode, Train and Interpret for Continuous-time Dynamic Graph Learning

Dynamic graphs arise in many real-world applications, and it is often desirable to model their dynamics directly in the continuous-time domain for flexibility. This paper presents an easy-to-use pipeline, termed EasyDGL (a name that also reflects its implementation on the DGL toolkit), composed of three key modules that together offer both strong fitting ability and interpretability. Specifically, the pipeline covers encoding, training and interpreting: i) a temporal point process (TPP) modulated attention architecture that endows continuous-time resolution to the coupled spatiotemporal dynamics of a graph observed through edge-addition events; ii) a principled loss that combines a task-agnostic term maximizing the TPP posterior of the observed events on the graph with a task-aware term based on a masking strategy over the dynamic graph, where the covered tasks include dynamic link prediction, dynamic node classification and node traffic forecasting; iii) interpretation of the model outputs (e.g., representations and predictions) via scalable perturbation-based quantitative analysis in the graph Fourier domain, which reflects the behavior of the learned model more comprehensively. Extensive experiments on public benchmarks show the superior performance of EasyDGL on time-conditioned predictive tasks, and in particular demonstrate that EasyDGL can effectively quantify the predictive power of the frequency content that a model learns from the evolving graph data.
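To make the encoding idea concrete, below is a minimal PyTorch sketch of a TPP-modulated attention layer: attention scores over a node's past edge-addition events are re-weighted by a Hawkes-style conditional intensity that decays with elapsed time, so recent events excite the aggregation more. The class name, the softplus parameterization and the single learnable decay rate are illustrative assumptions for exposition, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TPPModulatedAttention(nn.Module):
    """Sketch of attention modulated by a temporal point process intensity.

    Assumptions (not from the paper): a Hawkes-style exponential decay
    exp(-decay * dt) and a softplus to keep the intensity non-negative.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        # Learnable rate controlling how fast past events lose influence.
        self.decay = nn.Parameter(torch.tensor(1.0))

    def forward(self, h_dst: torch.Tensor, h_nbr: torch.Tensor,
                dt: torch.Tensor) -> torch.Tensor:
        # h_dst: (dim,)    target node state at prediction time t
        # h_nbr: (n, dim)  neighbor states from past edge-addition events
        # dt:    (n,)      elapsed times t - t_event, all >= 0
        scores = (self.k(h_nbr) @ self.q(h_dst)) / h_nbr.size(-1) ** 0.5
        # Conditional-intensity term: recent events contribute more.
        intensity = F.softplus(scores) * torch.exp(-self.decay * dt)
        alpha = intensity / intensity.sum().clamp_min(1e-8)  # normalize weights
        return alpha @ self.v(h_nbr)  # (dim,) time-aware aggregated message

# Usage on toy data: 5 past events around a 16-dimensional target node.
att = TPPModulatedAttention(dim=16)
out = att(torch.randn(16), torch.randn(5, 16), torch.rand(5))
```

The exponential decay mirrors the classical Hawkes process, where the influence of an event on the intensity fades with time; in the full model this modulation is what couples the attention weights to the continuous-time event dynamics rather than to sequence positions alone.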
