Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition

We propose a deep neural network architecture for storing approximate Lyapunov functions of systems of ordinary differential equations. Under a small-gain condition on the system, the number of neurons needed for an approximation of a Lyapunov function with fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach is able to overcome the curse of dimensionality.
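The abstract states only the complexity claim; the mechanism behind it is that, under a small-gain condition, an overall Lyapunov function can be composed from Lyapunov functions of low-dimensional subsystems, each of which is cheap to approximate by a small subnetwork. Below is a minimal sketch of such a compositional candidate, assuming the state splits into fixed-size subsystem blocks and the candidate is a plain sum of per-block terms; the class and function names are hypothetical and the weights are untrained placeholders, so this illustrates only the neuron-count scaling, not the paper's actual architecture or training procedure.

```python
import numpy as np

def make_subnetwork(d_i, width, rng):
    """One small two-layer ReLU network R^{d_i} -> R standing in for a
    low-dimensional subsystem Lyapunov function V_i (random placeholder weights)."""
    return {
        "W1": rng.standard_normal((width, d_i)),
        "b1": rng.standard_normal(width),
        "W2": rng.standard_normal((1, width)),
    }

def eval_subnetwork(net, z):
    h = np.maximum(net["W1"] @ z + net["b1"], 0.0)  # ReLU hidden layer
    return float(net["W2"] @ h)

class CompositionalLyapunovNet:
    """Candidate W(x) = sum_i V_i(z_i), where z_i collects the states of one
    low-dimensional subsystem.  The total hidden-neuron count is
    (number of subsystems) * width, so it grows only polynomially in the
    state dimension n rather than exponentially."""
    def __init__(self, index_sets, width=16, seed=0):
        rng = np.random.default_rng(seed)
        self.index_sets = index_sets
        self.subnets = [make_subnetwork(len(I), width, rng) for I in index_sets]

    def __call__(self, x):
        return sum(eval_subnetwork(net, x[I])
                   for net, I in zip(self.subnets, self.index_sets))

    def neuron_count(self):
        return sum(net["W1"].shape[0] for net in self.subnets)

# Example: n = 100 states split into 50 coupled two-dimensional subsystems.
n, d = 100, 2
index_sets = [np.arange(i, i + d) for i in range(0, n, d)]
W = CompositionalLyapunovNet(index_sets, width=16)
x = np.random.default_rng(1).standard_normal(n)
print(W(x), W.neuron_count())
```

With 50 two-dimensional blocks and 16 hidden neurons per block, this candidate uses 800 hidden neurons in total; the count grows linearly in the number of blocks, in contrast to a monolithic approximation whose size would grow exponentially in n for a fixed accuracy.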
