HONEM: Network Embedding Using Higher-Order Patterns in Sequential Data

Representation learning offers a powerful alternative to the often painstaking process of manual feature engineering and, as a result, has enjoyed considerable success in recent years. This success is especially striking in the context of graph mining, since networks can take advantage of vast troves of sequential data to encode information about interactions between entities of interest. But how do we learn embeddings on networks that have higher-order and sequential dependencies? Existing network embedding methods naively assume the Markov property (first-order dependency) for node interactions, which may fail to capture the time-dependent, longer-range interactions present in the raw sequential data. To address this limitation, we propose a network embedding method for higher-order networks (HON). We demonstrate that our higher-order network embedding (HONEM) method extracts the higher-order dependencies encoded in a HON to construct the higher-order neighborhood matrix of the network, dependencies that existing methods cannot capture. We show that HONEM outperforms other state-of-the-art methods in node classification, network reconstruction, link prediction, and visualization.
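The abstract stays at a high level, so the following is a minimal, hedged sketch of the general idea only (not the paper's exact algorithm): count multi-step transitions in the raw sequences, fold them into a single higher-order neighborhood matrix with a per-order decay weight, and factorize that matrix to obtain node embeddings. The function names, the `alpha` decay, and the use of truncated SVD here are illustrative assumptions, not details taken from the text above.

```python
# Hedged sketch of a HONEM-style pipeline. The dependency extraction,
# order weighting (alpha), and factorization choice are assumptions made
# for illustration, not the paper's exact procedure.
import numpy as np
from collections import defaultdict
from sklearn.decomposition import TruncatedSVD


def extract_dependencies(sequences, max_order=3):
    """Count k-step (source -> target) transitions for k = 1..max_order."""
    counts = {k: defaultdict(float) for k in range(1, max_order + 1)}
    for seq in sequences:
        for i, src in enumerate(seq):
            for k in range(1, max_order + 1):
                if i + k < len(seq):
                    counts[k][(src, seq[i + k])] += 1.0
    return counts


def higher_order_neighborhood(sequences, nodes, max_order=3, alpha=0.5):
    """Combine per-order transition counts into one neighborhood matrix,
    down-weighting longer-range dependencies by alpha**(k-1)."""
    idx = {n: i for i, n in enumerate(nodes)}
    S = np.zeros((len(nodes), len(nodes)))
    for k, pairs in extract_dependencies(sequences, max_order).items():
        for (u, v), c in pairs.items():
            S[idx[u], idx[v]] += (alpha ** (k - 1)) * c
    return S


def honem_like_embedding(sequences, nodes, dim=2, max_order=3, alpha=0.5):
    """Embed nodes by factorizing the higher-order neighborhood matrix."""
    S = higher_order_neighborhood(sequences, nodes, max_order, alpha)
    return TruncatedSVD(n_components=dim, random_state=0).fit_transform(S)


# Toy usage: short trajectories over four entities A-D.
sequences = [["A", "B", "C", "D"], ["A", "B", "D"], ["C", "B", "A"]]
nodes = ["A", "B", "C", "D"]
print(honem_like_embedding(sequences, nodes, dim=2))
```

The resulting rows of the matrix returned by `honem_like_embedding` can then be fed to downstream tasks such as node classification or link prediction, which is where the abstract reports its comparisons against first-order baselines.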
