GAP: Generalizable Approximate Graph Partitioning Framework

Graph partitioning is the problem of dividing the nodes of a graph into balanced partitions while minimizing the edge cut across the partitions. Because of its combinatorial nature, the problem is typically tackled with approximate methods, including variants of multi-level heuristics and spectral clustering. We propose GAP, a Generalizable Approximate Partitioning framework that takes a deep learning approach to graph partitioning. We define a differentiable loss function that represents the partitioning objective and use backpropagation to optimize the network parameters. Unlike baselines that must re-run the optimization from scratch for every new graph, GAP generalizes: trained models produce performant partitions at inference time, even on unseen graphs. Furthermore, because the graph representation is learned jointly with the partitioning loss, GAP is easily tuned to a variety of graph structures. We evaluate the performance of GAP on graphs of varying sizes and structures, including the computation graphs of widely used machine learning models (e.g., ResNet, VGG, and Inception-V3), scale-free graphs, and random graphs. We show that GAP achieves competitive partitions while being up to 100 times faster than the baseline and generalizes to unseen graphs.
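The abstract does not reproduce the loss itself, but as a rough illustration of the idea, below is a minimal PyTorch sketch of a differentiable, balanced normalized-cut objective of the kind described: a soft assignment matrix (e.g., the softmax output of a graph neural network) replaces the discrete partition, making the expected cut and a balance penalty differentiable. The function name gap_loss, the balance weight lam, and the dense adjacency representation are illustrative assumptions, not the authors' implementation.

```python
import torch

def gap_loss(Y: torch.Tensor, A: torch.Tensor, lam: float = 1.0) -> torch.Tensor:
    """Differentiable relaxation of a balanced normalized-cut objective (sketch).

    Y: (n, g) soft assignments of n nodes to g partitions, rows summing to 1.
    A: (n, n) dense adjacency matrix.
    lam: weight on the balance penalty (illustrative hyperparameter).
    """
    n, g = Y.shape
    d = A.sum(dim=1)                       # node degrees, shape (n,)
    gamma = Y.t() @ d + 1e-9               # expected volume of each partition, (g,)
    # Diagonal of Y^T A (1 - Y): expected weight of edges leaving each partition.
    cut = torch.diagonal(Y.t() @ A @ (1.0 - Y))
    ncut = (cut / gamma).sum()             # expected normalized cut
    # Penalize deviation of expected partition sizes from n / g.
    balance = ((Y.sum(dim=0) - n / g) ** 2).sum()
    return ncut + lam * balance

# Usage sketch: differentiate the loss w.r.t. assignment logits.
logits = torch.randn(6, 2, requires_grad=True)   # 6 nodes, 2 partitions
A = torch.ones(6, 6) - torch.eye(6)              # toy complete graph
loss = gap_loss(torch.softmax(logits, dim=1), A)
loss.backward()
```

In a full pipeline these logits would be produced by a learned graph embedding module, so gradients flow through the assignment into the network parameters, which is what allows a trained model to be reused on unseen graphs.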
