In this work we generalize traditional node/link prediction tasks in dynamic heterogeneous networks to joint prediction over larger k-node induced subgraphs. Our key insight is to incorporate the unavoidable dependencies among the training observations of induced subgraphs into both the input features and the model architecture itself via high-order dependencies. The strength of the representation is its invariance to isomorphisms and varying local neighborhood sizes, while still being able to take node/edge labels into account and facilitating inductive reasoning (i.e., generalization to unseen portions of the network). Empirical results show that our proposed method significantly outperforms other state-of-the-art methods designed for static and/or single node/link prediction tasks. In addition, we show that our method is scalable and learns interpretable parameters.

Introduction

Learning predictive models of heterogeneous relational and network data is a fundamental task in machine learning and data mining (Getoor and Mihalkova 2011; Lao and Cohen 2010; Lin et al. 2015; Grover and Leskovec 2016; Nickel, Rosasco, and Poggio 2016). Much of the work on heterogeneous networks (graphs with node and edge labels) has focused on developing methods for label prediction or single link prediction. There has been relatively little development of methods that make joint predictions over larger substructures (e.g., induced k-node subgraphs). Recent research has shown rich higher-order organization of such networks (Benson, Gleich, and Leskovec 2016; Xu, Wickramarathne, and Chawla 2016) and complex subgraph evolution patterns within larger graphs (Paranjape, Benson, and Leskovec 2017). Applications range from predicting group activity in social networks (e.g., online social network ad revenues rely heavily on user activity), to computational social science (e.g., predicting the dynamics of groups and their social relationships), to relational learning (e.g., finding missing joint relationships and predicting future ones in knowledge graphs).

The main challenge in learning a model to predict the evolution of labeled subgraphs is to jointly account for the induced subgraph dependencies that emerge from subgraphs sharing edges. Unlike node and edge prediction tasks, it is not clear how to describe an approximate model that can account for these dependencies. A variety of recent methods have developed heuristics to encode joint label and structure information into low-dimensional node or edge embeddings, but it is unclear how these ad hoc methods can properly address the induced subgraph dependencies (Lao and Cohen 2010; Nickel, Tresp, and Kriegel 2011; Dong et al. 2014; Lin et al. 2015; Grover and Leskovec 2016; Atwood and Towsley 2016; Nickel, Rosasco, and Poggio 2016; Rahman and Al Hasan 2016). Our empirical results show that these methods tend to perform poorly in induced subgraph prediction tasks. The task of predicting induced subgraph evolution requires an approach that can take into account higher-order dependencies between the induced subgraphs (due to their shared edges and non-edges, where a non-edge marks the absence of an edge).

Our two main contributions are:
(1) We target the evolution of larger graph structures than nodes and edges, which, to the best of our knowledge, has not been addressed before. Traditional link prediction tasks are simpler special cases of our task.
(2) We incorporate the unavoidable dependencies within the training observations of induced subgraphs into both the input features and the model architecture itself via high-order dependencies. We call our model architecture the Subgraph Pattern Neural Network (SPNN) and show that its strength comes from a representation that is invariant to isomorphisms and varying local neighborhood sizes, can take node/edge labels into account, and facilitates inductive reasoning. SPNN is a discriminative feedforward neural network with hidden layers that represent the dependent subgraph patterns observed in the training data. The input features of SPNN extend the definition of induced isomorphism density (Lovász and Szegedy 2006) to a local graph neighborhood in a way that accounts for joint edges and non-edges in the induced subgraphs. Moreover, SPNN is inductive (it can be applied to unseen portions of the graph) and isomorphism-invariant, such that the learned model is invariant to node permutations. We also show that SPNN learns to predict using an interpretable neural network structure.
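To make these two ingredients concrete, the sketch below computes a local analogue of induced isomorphism density for a candidate node set and feeds it through a small feedforward classifier. This is only an illustrative sketch under our own assumptions, not the paper's implementation: it restricts patterns to 3-node induced subgraphs grouped by edge count, defines the local neighborhood as the 1-hop union around the target nodes, ignores node/edge labels, and uses untrained weights; the function names (local_pattern_densities, feedforward_predict) are ours.

```python
# Hedged sketch of SPNN-style inputs and a feedforward scorer.
# Assumptions (not from the paper): 3-node patterns binned by edge count,
# 1-hop joint neighborhood, unlabeled graph, untrained random weights.
from itertools import combinations

import networkx as nx
import numpy as np


def local_pattern_densities(G, target_nodes, radius=1):
    """Normalized frequencies of 3-node induced subgraph patterns (0-3 edges)
    in the union of the targets' r-hop neighborhoods -- a local analogue of
    induced isomorphism density."""
    neighborhood = set(target_nodes)
    for v in target_nodes:
        neighborhood |= set(nx.single_source_shortest_path_length(G, v, cutoff=radius))
    counts = np.zeros(4)  # one bin per isomorphism class of a 3-node graph
    for triple in combinations(sorted(neighborhood), 3):
        counts[G.subgraph(triple).number_of_edges()] += 1
    total = counts.sum()
    return counts / total if total > 0 else counts


def feedforward_predict(x, W1, b1, w2, b2):
    """One hidden layer (in SPNN the hidden units correspond to subgraph
    patterns seen in training) followed by a sigmoid output score."""
    h = np.maximum(0.0, W1 @ x + b1)              # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))   # evolution probability


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    G = nx.karate_club_graph()                    # stand-in for a heterogeneous network
    x = local_pattern_densities(G, target_nodes=[0, 1, 2])
    W1, b1 = rng.normal(size=(8, x.size)), np.zeros(8)
    w2, b2 = rng.normal(size=8), 0.0
    print("pattern densities:", x)
    print("predicted score:", feedforward_predict(x, W1, b1, w2, b2))
```

In SPNN proper the hidden units correspond to the dependent subgraph patterns observed in training and the weights are learned discriminatively; the sketch only mirrors that overall shape.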
[1] Jure Leskovec, et al. Motifs in Temporal Networks. WSDM, 2016.
[2] Lada A. Adamic, et al. Friends and neighbors on the Web. Soc. Networks, 2003.
[3] Cristina E. Manfredotti, et al. Modeling and Inference with Relational Dynamic Bayesian Networks. Canadian AI, 2009.
[4] V. Sós, et al. Convergent Sequences of Dense Graphs I: Subgraph Frequencies, Metric Properties and Testing. arXiv:math/0702004, 2007.
[5] Jennifer Neville, et al. Relational Dependency Networks. J. Mach. Learn. Res., 2007.
[6] Donald F. Towsley, et al. Efficiently Estimating Motif Statistics of Large Networks. TKDD, 2013.
[7] Evgeniy Gabrilovich, et al. A Review of Relational Machine Learning for Knowledge Graphs. Proceedings of the IEEE, 2015.
[8] Mathias Niepert, et al. Learning Convolutional Neural Networks for Graphs. ICML, 2016.
[9] S. V. N. Vishwanathan, et al. A Structural Smoothing Framework For Robust Graph Comparison. NIPS, 2015.
[10] Jure Leskovec, et al. Higher-order organization of complex networks. Science, 2016.
[11] Luc De Raedt, et al. Graph Invariant Kernels. IJCAI, 2015.
[12] Mohammad Al Hasan, et al. Link Prediction in Dynamic Networks Using Graphlet. ECML/PKDD, 2016.
[13] Joan Bruna, et al. Spectral Networks and Locally Connected Networks on Graphs. ICLR, 2013.
[14] Wei Wang, et al. Efficient mining of frequent subgraphs in the presence of isomorphism. Third IEEE International Conference on Data Mining, 2003.
[15] Wei Zhang, et al. Knowledge vault: a web-scale approach to probabilistic knowledge fusion. KDD, 2014.
[16] Le Song, et al. Discriminative Embeddings of Latent Variable Models for Structured Data. ICML, 2016.
[17] Larry P. Heck, et al. Deep learning of knowledge graph embeddings for semantic parsing of Twitter dialogs. IEEE Global Conference on Signal and Information Processing (GlobalSIP), 2014.
[18] Ni Lao, et al. Relational retrieval using a combination of path-constrained random walks. Machine Learning, 2010.
[19] Nitesh V. Chawla, et al. New perspectives and methods in link prediction. KDD, 2010.
[20] Zhiyuan Liu, et al. Learning Entity and Relation Embeddings for Knowledge Graph Completion. AAAI, 2015.
[21] Lise Getoor, et al. Learning statistical models from relational data. SIGMOD '11, 2011.
[22] Mingzhe Wang, et al. LINE: Large-scale Information Network Embedding. WWW, 2015.
[23] László Lovász, et al. Limits of dense graph sequences. J. Comb. Theory B, 2004.
[24] Joan Bruna, et al. Deep Convolutional Networks on Graph-Structured Data. arXiv, 2015.
[25] Charu C. Aggarwal, et al. Co-author Relationship Prediction in Heterogeneous Bibliographic Networks. International Conference on Advances in Social Networks Analysis and Mining, 2011.
[26] Nitesh V. Chawla, et al. Representing higher-order dependencies in networks. Science Advances, 2015.
[27] Matthew Richardson, et al. Markov logic networks. Machine Learning, 2006.
[28] Philip S. Yu, et al. PathSim. Proc. VLDB Endow., 2011.
[29] Steven Skiena, et al. DeepWalk: online learning of social representations. KDD, 2014.
[30] Donald F. Towsley, et al. Diffusion-Convolutional Neural Networks. NIPS, 2015.
[31] Pinar Yanardag, et al. Deep Graph Kernels. KDD, 2015.
[32] Jure Leskovec, et al. node2vec: Scalable Feature Learning for Networks. KDD, 2016.
[33] Hans-Peter Kriegel, et al. A Three-Way Model for Collective Learning on Multi-Relational Data. ICML, 2011.
[34] David Liben-Nowell, et al. The link-prediction problem for social networks. 2007.
[35] Lorenzo Rosasco, et al. Holographic Embeddings of Knowledge Graphs. AAAI, 2015.