Inductive Representation Learning on Temporal Graphs

Inductive representation learning on temporal graphs is an important step toward scalable machine learning on real-world dynamic networks. The evolving nature of temporal graphs requires handling new nodes as well as capturing temporal patterns. The node embeddings, which are now functions of time, should represent both the static node features and the evolving topological structures. Moreover, node and topological features can themselves be temporal, and the node embeddings should capture those patterns as well. We propose the temporal graph attention (TGAT) layer to efficiently aggregate temporal-topological neighborhood features and to learn time-feature interactions. TGAT uses self-attention as its building block together with a novel functional time encoding technique derived from the classical Bochner's theorem of harmonic analysis. By stacking TGAT layers, the network treats node embeddings as functions of time and can inductively infer embeddings for both new and observed nodes as the graph evolves. The proposed approach handles both node classification and link prediction tasks, and naturally extends to incorporate temporal edge features. We evaluate our method on transductive and inductive tasks under temporal settings, using two benchmark datasets and one industrial dataset. TGAT compares favorably to state-of-the-art baselines as well as previous temporal graph embedding approaches.
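To make the two core components concrete, below is a minimal PyTorch sketch, not the authors' released implementation: the module names TimeEncode and TGATLayer, the random initialization of the frequencies, and the use of nn.MultiheadAttention in place of the paper's explicit per-head projections are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class TimeEncode(nn.Module):
    """Functional time encoding via learnable Fourier features.

    Bochner's theorem expresses a translation-invariant kernel on time
    differences as the Fourier transform of a probability measure; using a
    finite set of (here learnable) frequencies gives the feature map
    Phi(t) = sqrt(1/d) [cos(w_1 t), sin(w_1 t), ..., cos(w_d t), sin(w_d t)].
    """

    def __init__(self, dim: int):
        super().__init__()
        self.dim = dim
        self.w = nn.Parameter(torch.randn(dim))  # frequencies w_1..w_d (illustrative init)

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (...,) time spans; returns (..., 2 * dim) encodings
        phase = t.unsqueeze(-1) * self.w
        return torch.cat([phase.cos(), phase.sin()], dim=-1) * self.dim ** -0.5


class TGATLayer(nn.Module):
    """Single attention layer over a node's temporal neighborhood (sketch).

    The target node at query time t0 attends to sampled temporal neighbors,
    each key/value being [neighbor hidden state || encoding of the time span].
    Note: feat_dim + 2 * time_dim must be divisible by num_heads.
    """

    def __init__(self, feat_dim: int, time_dim: int, num_heads: int = 2):
        super().__init__()
        self.time_enc = TimeEncode(time_dim)
        d_model = feat_dim + 2 * time_dim
        self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.out = nn.Linear(d_model, feat_dim)

    def forward(self, h_target, h_neighbors, delta_t):
        # h_target: (B, F); h_neighbors: (B, N, F); delta_t: (B, N) spans t0 - t_i
        zero_span = torch.zeros(h_target.size(0), 1, device=h_target.device)
        q = torch.cat([h_target.unsqueeze(1), self.time_enc(zero_span)], dim=-1)
        kv = torch.cat([h_neighbors, self.time_enc(delta_t)], dim=-1)
        z, _ = self.attn(q, kv, kv)    # attention-weighted temporal aggregation
        return self.out(z.squeeze(1))  # embedding of the target node at time t0
```

Stacking L such layers makes a node's embedding at time t0 depend on its L-hop temporal neighborhood, which is what allows embeddings to be inferred inductively for nodes unseen during training; temporal edge features, when present, can be concatenated into the key/value vectors in the same way.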
