Stochastic Graph Neural Networks

Graph neural networks (GNNs) model nonlinear representations in graph data, with applications in distributed agent coordination, control, and planning, among others. However, current GNN implementations assume ideal distributed scenarios and ignore link fluctuations that arise from environmental or human factors. In such settings, the GNN fails to perform its distributed task if the topological randomness is not accounted for. To overcome this issue, we put forth the stochastic graph neural network (SGNN) model: a GNN in which the distributed graph convolutional operator is modified to account for the network changes. Since this stochasticity introduces a new learning paradigm, we develop a dedicated learning process for the SGNN and employ the stochastic gradient descent (SGD) algorithm to estimate its parameters. We prove that the SGD-based SGNN learning process converges to a stationary point under mild Lipschitz assumptions. Numerical simulations corroborate the proposed theory and show an improved performance of the SGNN compared with the conventional GNN when operating over random time-varying graphs.
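The sketch below illustrates the core idea under stated assumptions; it is not the authors' implementation. A stochastic graph convolution samples a fresh random realization of the graph shift operator at every diffusion step (here, i.i.d. Bernoulli link drops with an assumed keep probability `p_keep`), and the layer is trained with plain SGD so each gradient sample sees a different random graph. The filter order `K`, feature sizes, and synthetic data are illustrative choices.

```python
# Minimal sketch of a stochastic graph convolution layer (assumptions noted above).
import torch
import torch.nn as nn


class StochasticGraphConv(nn.Module):
    def __init__(self, S, in_features, out_features, K=3, p_keep=0.9):
        super().__init__()
        self.register_buffer("S", S)   # nominal shift operator (adjacency), shape (N, N)
        self.K = K                     # filter order: number of diffusion taps
        self.p_keep = p_keep           # probability that a link survives at each step
        # One weight matrix per diffusion tap: H_k in R^{F_in x F_out}
        self.weights = nn.Parameter(torch.randn(K + 1, in_features, out_features) * 0.1)

    def sample_shift(self):
        # Each existing link is kept independently with probability p_keep
        # (drops are applied per directed entry for simplicity).
        mask = torch.bernoulli(torch.full_like(self.S, self.p_keep))
        return self.S * mask

    def forward(self, x):
        # x: node features, shape (N, F_in)
        z = x
        out = z @ self.weights[0]
        for k in range(1, self.K + 1):
            S_k = self.sample_shift()  # new random graph realization at every step
            z = S_k @ z                # one-hop diffusion over the sampled graph
            out = out + z @ self.weights[k]
        return torch.relu(out)


# Training with SGD: every forward pass sees a different random graph,
# so the loss is a stochastic sample of the expected cost over realizations.
N, F_in, F_out = 20, 4, 2
S = (torch.rand(N, N) < 0.3).float()
S = torch.triu(S, 1); S = S + S.T          # symmetric adjacency, no self-loops
layer = StochasticGraphConv(S, F_in, F_out)
opt = torch.optim.SGD(layer.parameters(), lr=1e-2)

x, y = torch.randn(N, F_in), torch.randn(N, F_out)
for _ in range(100):
    opt.zero_grad()
    loss = ((layer(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
```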
