SUSAN: The Structural Similarity Random Walk Kernel

Random walk kernels are a flexible family of graph kernels into which edge and vertex similarities can be incorporated through positive definite kernels. In this work we study the particular case within this family in which the vertex kernel has bounded support. We motivate this property as providing configurable flexibility in how vertices of the two graphs may be aligned during the walk. We study several fast and intuitive ways to derive structurally aware labels and combine them with such a vertex kernel, which is in turn incorporated into the random walk kernel. We provide a fast algorithm to compute the resulting kernel and give precise bounds on its computational complexity. We show that this complexity always remains upper bounded by that of alternative methods in the literature and study the conditions under which the advantage can be significantly larger. We evaluate the resulting configurations on their predictive performance on several families of graphs and show significant improvements over the vanilla random walk kernel and other competing algorithms.
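
To make the setting concrete, the following minimal sketch computes a generalized geometric random walk kernel on the direct product graph, using a toy bounded-support vertex kernel that is positive only for matching structural labels. The function names, the label-matching kernel, and the dense Kronecker-product formulation are illustrative assumptions; this is not the SUSAN algorithm itself, whose fast computation exploits the sparsity that bounded support induces in the product graph.

```python
import numpy as np

def vertex_kernel(label_u, label_v):
    # Toy bounded-support vertex kernel (illustrative assumption): strictly
    # positive only when the structural labels match, zero otherwise.
    return 1.0 if label_u == label_v else 0.0

def geometric_random_walk_kernel(A1, labels1, A2, labels2, lam=0.01):
    """Geometric random walk kernel between two labelled graphs, computed on
    the direct product graph whose edges are weighted by the vertex-kernel
    values of the endpoint pairs. With a 0/1 bounded-support vertex kernel
    this reduces to the classical labelled direct product construction."""
    n1, n2 = len(labels1), len(labels2)
    # Vertex-kernel values for every pair of vertices across the two graphs.
    V = np.array([[vertex_kernel(a, b) for b in labels2] for a in labels1])
    v = V.ravel()
    # Weight matrix of the direct product graph: Kronecker product of the
    # adjacency matrices, rescaled by the vertex kernels of both endpoints.
    W = np.kron(A1, A2) * np.outer(v, v)
    # Uniform starting distribution and all-ones stopping weights.
    p = np.full(n1 * n2, 1.0 / (n1 * n2))
    q = np.ones(n1 * n2)
    # sum_k lam^k q^T W^k p = q^T (I - lam W)^{-1} p, valid when lam is below
    # the reciprocal of the spectral radius of W.
    return q @ np.linalg.solve(np.eye(n1 * n2) - lam * W, p)
```

In this dense form the linear solve costs O((n1*n2)^3); with a bounded-support vertex kernel most entries of v vanish, so W becomes sparse, which is the structure that the fast algorithm described in the abstract exploits.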
