Computing Lyapunov functions using deep neural networks

We propose a deep neural network architecture and a training algorithm for computing approximate Lyapunov functions of systems of nonlinear ordinary differential equations. Under the assumption that the system admits a compositional Lyapunov function, we prove that the number of neurons needed to approximate a Lyapunov function to a fixed accuracy grows only polynomially in the state dimension, i.e., the proposed approach overcomes the curse of dimensionality. We show that nonlinear systems satisfying a small-gain condition admit compositional Lyapunov functions. Numerical examples in up to ten space dimensions illustrate the performance of the training scheme.
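
To make the training idea concrete, below is a minimal sketch (an illustration, not the paper's exact architecture or loss): a feedforward network V_theta is trained so that V_theta(0) = 0, V_theta(x) > 0, and the orbital derivative grad V_theta(x) . f(x) is negative on sampled points, a loss design common in the neural-Lyapunov literature. The vector field f, the network sizes, the margin eps, and the sampling domain are all assumed for illustration; the paper's actual architecture additionally exploits the compositional form V(x) = sum_j V_j(z_j) over low-dimensional subvectors z_j, which this sketch omits.

```python
# Minimal sketch of Lyapunov-function training (assumptions noted below;
# not the authors' exact scheme). We fit V_theta : R^2 -> R so that
# V(0) = 0, V(x) > 0, and grad V(x) . f(x) < 0 on sampled points.
import torch

torch.manual_seed(0)

def f(x):
    # Illustrative vector field with an asymptotically stable origin
    # (an assumption, not an example from the paper):
    # x1' = -x1 + x1 * x2,   x2' = -x2
    x1, x2 = x[:, 0], x[:, 1]
    return torch.stack([-x1 + x1 * x2, -x2], dim=1)

# Small smooth feedforward candidate V_theta.
V = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Softplus(),
    torch.nn.Linear(32, 32), torch.nn.Softplus(),
    torch.nn.Linear(32, 1),
)

opt = torch.optim.Adam(V.parameters(), lr=1e-3)
eps = 0.1  # margin enforcing strict positivity / strict decrease (hypothetical choice)

for step in range(5000):
    # Sample training points from the domain [-2, 2]^2 (illustrative choice).
    x = 4.0 * torch.rand(256, 2) - 2.0
    x.requires_grad_(True)
    v = V(x).squeeze(-1)
    # grad V(x) via autograd, then the orbital derivative grad V(x) . f(x).
    (gradV,) = torch.autograd.grad(v.sum(), x, create_graph=True)
    vdot = (gradV * f(x)).sum(dim=1)
    r2 = (x * x).sum(dim=1)
    loss = (
        torch.relu(eps * r2 - v).mean()        # push V(x) >= eps * |x|^2
        + torch.relu(vdot + eps * r2).mean()   # push grad V . f <= -eps * |x|^2
        + V(torch.zeros(1, 2)).pow(2).mean()   # pin V(0) = 0
    )
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Since the empirical loss only penalizes violations at sampled points, a trained candidate does not by itself certify the Lyapunov conditions everywhere; in practice one checks V_theta and its orbital derivative a posteriori, e.g., on a fine grid over the domain of interest.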
