Stability of Manifold Neural Networks to Deformations

Stability is an important property of graph neural networks (GNNs) that helps explain their success in many problems of practical interest. Existing GNN stability results depend on the size of the graph, which restricts their applicability to graphs of moderate size. To understand the stability properties of GNNs on large graphs, we consider neural networks supported on manifolds. These networks are defined in terms of manifold diffusions mediated by the Laplace-Beltrami (LB) operator and can be interpreted as limits of GNNs running on graphs of growing size. We define manifold deformations and show that they lead to perturbations of the manifold’s LB operator consisting of an absolute and a relative perturbation term. We then define filters that split the infinite-dimensional spectrum of the LB operator into a finite number of partitions, and prove that manifold neural networks (MNNs) built with these filters are stable to both absolute and relative perturbations of the LB operator. The stability results are illustrated numerically on resource allocation problems in wireless networks.
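To make the construction concrete, the sketch below shows one discrete MNN layer on a point cloud sampled from a manifold. It is a minimal illustration, not the paper's implementation: the LB operator is approximated by a Gaussian-kernel graph Laplacian (a standard discretization on sampled manifolds), the filter is applied in the spectral domain as h(L)x = sum_k h(lambda_k) <phi_k, x> phi_k, and the names gaussian_laplacian, spectral_filter, mnn_layer, and the epsilon normalization are illustrative assumptions.

import numpy as np

def gaussian_laplacian(points, epsilon=0.1):
    """Dense graph Laplacian from a Gaussian kernel; a common discrete
    approximation of the Laplace-Beltrami operator on sampled manifolds."""
    sq_dists = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / (4 * epsilon))
    np.fill_diagonal(W, 0.0)
    D = np.diag(W.sum(axis=1))
    # Assumed normalization: scaling by 1/(epsilon * n) as the sample size grows.
    return (D - W) / (epsilon * len(points))

def spectral_filter(L, x, h):
    """Apply h(L) x = sum_k h(lambda_k) <phi_k, x> phi_k via the
    eigendecomposition of the symmetric Laplacian L."""
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs @ (h(eigvals) * (eigvecs.T @ x))

def mnn_layer(L, x, h):
    """One layer: spectral convolution followed by a pointwise ReLU."""
    return np.maximum(spectral_filter(L, x, h), 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    theta = np.sort(rng.uniform(0, 2 * np.pi, 300))
    points = np.column_stack([np.cos(theta), np.sin(theta)])  # samples of the unit circle
    x = np.sin(3 * theta)                                      # a signal on the manifold
    low_pass = lambda lam: np.exp(-lam)                        # example spectral response
    y = mnn_layer(gaussian_laplacian(points), x, low_pass)
    print(y.shape)

In this sketch a deformation of the manifold would perturb the sampled points, and hence the discrete Laplacian, which is the discrete analogue of the absolute and relative LB perturbations analyzed in the abstract.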
