A Recurrent Graph Neural Network for Multi-relational Data

The era of "data deluge" has sparked interest in graph-based learning methods across disciplines such as sociology, biology, neuroscience, and engineering. In this paper, we introduce a graph recurrent neural network (GRNN) for scalable semi-supervised learning from multi-relational data. Key aspects of the novel GRNN architecture are the use of multi-relational graphs, the dynamic adaptation to the different relations via learnable weights, and the consideration of graph-based regularizers to promote smoothness and alleviate over-parametrization. Our ultimate goal is to design a powerful learning architecture able to discover complex and highly nonlinear data associations, combine (and select) multiple types of relations, and scale gracefully with the size of the graph. Numerical tests with real datasets corroborate the design goals and illustrate the performance gains relative to competing alternatives.
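To make the architecture concrete, the following is a minimal sketch (not the authors' exact formulation) of one recurrent propagation step over a multi-relational graph: the R relation-specific adjacency matrices are combined through learnable mixing weights, and the result diffuses both the node features and the previous hidden state. The function names, weight shapes, and the softmax mixing are illustrative assumptions.

```python
import numpy as np

def grnn_step(H, adjacencies, alpha, W_in, W_rec, X):
    """One hypothetical recurrent propagation step over a multi-relational graph.

    H           : (N, F) hidden state from the previous step
    adjacencies : list of R (N, N) normalized adjacency matrices, one per relation
    alpha       : (R,) learnable mixing weights over relations (assumed parameterization)
    W_in, W_rec : (D, F) and (F, F) learnable weight matrices
    X           : (N, D) node feature matrix
    """
    # Combine the R relations into a single effective graph via softmax-normalized weights,
    # so the model can emphasize informative relations and down-weight uninformative ones.
    w = np.exp(alpha) / np.exp(alpha).sum()
    S = sum(w_r * A_r for w_r, A_r in zip(w, adjacencies))
    # Diffuse the input features and the previous hidden state over the combined graph,
    # then apply a pointwise nonlinearity.
    return np.tanh(S @ (X @ W_in + H @ W_rec))
```

In use, such a step would be unrolled for a fixed number of iterations before a per-node softmax readout for semi-supervised classification; a graph-based smoothness regularizer (e.g., a Laplacian quadratic term on the hidden representations) could then be added to the training loss, in line with the regularization the abstract mentions.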
