HNHN: Hypergraph Networks with Hyperedge Neurons

Hypergraphs provide a natural representation for many real-world datasets. We propose a novel framework, HNHN, for hypergraph representation learning. HNHN is a hypergraph convolution network with nonlinear activation functions applied to both hypernodes and hyperedges, combined with a normalization scheme that can flexibly adjust the importance of high-cardinality hyperedges and high-degree vertices depending on the dataset. We demonstrate improved performance of HNHN in both classification accuracy and speed on real-world datasets when compared to state-of-the-art methods.
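The abstract's core idea (a two-step convolution that applies nonlinearities on both the hyperedge and hypernode sides, with degree-power normalization) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function name `hnhn_layer`, the use of plain ReLU, and the simple `|e|^alpha` / `deg(v)^beta` normalization are assumptions made here for clarity.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hnhn_layer(X_v, H, W_v, W_e, alpha=0.0, beta=0.0):
    """One illustrative HNHN-style convolution layer (sketch, not the paper's exact scheme).

    X_v:   (n_nodes, d_in) hypernode features
    H:     (n_nodes, n_edges) incidence matrix, H[v, e] = 1 if node v is in hyperedge e
    W_v:   (d_in, d_hid) node-to-edge weight matrix
    W_e:   (d_hid, d_out) edge-to-node weight matrix
    alpha: exponent down/up-weighting high-cardinality hyperedges (assumed form)
    beta:  exponent down/up-weighting high-degree vertices (assumed form)
    """
    # Hyperedge-side normalization: scale each edge's aggregate by |e|^alpha.
    edge_card = H.sum(axis=0)                      # cardinality of each hyperedge
    D_e = np.maximum(edge_card, 1.0) ** alpha
    # Node -> hyperedge step, with its own nonlinearity (hyperedge neurons).
    X_e = relu((H.T @ (X_v @ W_v)) / D_e[:, None])

    # Node-side normalization: scale each node's aggregate by deg(v)^beta.
    node_deg = H.sum(axis=1)                       # degree of each hypernode
    D_v = np.maximum(node_deg, 1.0) ** beta
    # Hyperedge -> node step, again followed by a nonlinearity.
    return relu((H @ (X_e @ W_e)) / D_v[:, None])
```

Setting `alpha` and `beta` to 0 recovers unnormalized mean-free aggregation, while nonzero values let large hyperedges or high-degree vertices contribute more or less, which is the flexibility the abstract refers to.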
