Absolute stability of analytic neural networks: an approach based on finite trajectory length

A neural network (NN) is said to be convergent (or completely stable) when every trajectory tends to an equilibrium point (a stationary state). A stronger property is absolute stability, which means that convergence holds for any choice of the neural network parameters and any choice of the nonlinear functions within specified, well-characterized sets. In particular, absolute stability requires that the NN be convergent even when, for some parameter values, it possesses nonisolated equilibrium points (e.g., a manifold of equilibria). Such a property, which is well suited to solving several classes of signal processing tasks in real time, cannot in general be established via the classical LaSalle approach, owing to the inherent limitations of that approach for studying convergence when the NN has nonisolated equilibrium points. A method to address absolute stability is developed, based on proving that the total length of each NN trajectory is finite. A fundamental result on absolute stability is given under the hypotheses that the NN possesses a Lyapunov function and that the nonlinearities involved (neuron activations, inhibitions, etc.) are modeled by analytic functions. At the core of the proof of finiteness of trajectory length is the use of some basic inequalities for analytic functions due to Łojasiewicz. The result is applicable to a large class of neural networks, which includes the networks proposed by Vidyasagar, the Hopfield neural networks, and the standard cellular NN introduced by Chua and Yang.
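The finite-length argument can be sketched in its simplest form, for a gradient flow with an analytic energy function; the paper's actual setting (a general Lyapunov function for the NN dynamics) is broader, so the following is illustrative only:

```latex
% Gradient flow with analytic energy E:
\dot{x}(t) = -\nabla E\bigl(x(t)\bigr), \qquad E \text{ analytic}.

% Łojasiewicz gradient inequality near a critical value E^{*}:
% there exist c > 0 and \theta \in (0, \tfrac12] such that
\lvert E(x) - E^{*} \rvert^{\,1-\theta} \le c\, \lVert \nabla E(x) \rVert .

% Along the flow, E decreases, and the inequality gives
\frac{d}{dt}\bigl(E(x(t)) - E^{*}\bigr)^{\theta}
  = -\,\theta \bigl(E - E^{*}\bigr)^{\theta-1} \lVert \nabla E \rVert^{2}
  \le -\,\frac{\theta}{c}\, \lVert \dot{x}(t) \rVert .

% Integrating bounds the trajectory length, forcing convergence
% even toward a manifold of nonisolated equilibria:
\int_{0}^{\infty} \lVert \dot{x}(t) \rVert \, dt
  \le \frac{c}{\theta}\,\bigl(E(x(0)) - E^{*}\bigr)^{\theta} < \infty .
```

Finite length implies that \(x(t)\) is Cauchy and hence converges to a single equilibrium, which is exactly what the LaSalle invariance argument cannot guarantee when equilibria are nonisolated.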

[1] M. Forti, et al. Necessary and sufficient condition for absolute stability of neural networks, 1994.

[2] S. G. Romaniuk. A general class of neural networks, 1994, Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94).

[3] P. Hartman. Ordinary Differential Equations, 1965.

[4] L. Chua, et al. A more rigorous proof of complete stability of cellular neural networks, 1997.

[5] N. G. Parke, et al. Ordinary Differential Equations, 1958.

[6] Jun Wang, et al. Absolute exponential stability of a class of continuous-time recurrent neural networks, 2003, IEEE Trans. Neural Networks.

[7] Hsiao-Dong Chiang, et al. Theory of stability regions for a class of nonhyperbolic dynamical systems and its application to constraint satisfaction problems, 2002.

[8] S. Łojasiewicz. Ensembles semi-analytiques, 1965.

[9] Bing J. Sheu, et al. Search of optimal solutions in multi-level neural networks, 1994, Proceedings of IEEE International Symposium on Circuits and Systems - ISCAS '94.

[10] Stephen Grossberg, et al. Absolute stability of global pattern formation and parallel memory storage by competitive neural networks, 1983, IEEE Transactions on Systems, Man, and Cybernetics.

[11] J. Hale, et al. Ordinary Differential Equations, 2019, Fundamentals of Numerical Mathematics for Physicists and Engineers.

[12] J. Palis, et al. Geometric theory of dynamical systems, 1982.

[13] A. Tesi, et al. New conditions for global stability of neural networks with application to linear and quadratic programming problems, 1995.

[14] Jun Wang, et al. An additive diagonal-stability condition for absolute exponential stability of a general class of neural networks, 2001.

[15] Leon O. Chua, et al. Neural networks for nonlinear programming, 1988.

[16] Liang Jin, et al. Absolute stability conditions for discrete-time recurrent neural networks, 1994, IEEE Trans. Neural Networks.

[17] Wolfgang Porod, et al. Qualitative analysis and synthesis of a class of neural networks, 1988.

[18] J. J. Hopfield, et al. Neurons with graded response have collective computational properties like those of two-state neurons, 1984, Proceedings of the National Academy of Sciences of the United States of America.

[19] S. Łojasiewicz. Sur le problème de la division, 1959.

[20] Lin-Bao Yang, et al. Cellular neural networks: theory, 1988.

[21] Michael A. Shanblatt, et al. Linear and quadratic programming neural network analysis, 1992, IEEE Trans. Neural Networks.

[22] Alberto Tesi, et al. A New Method to Analyze Complete Stability of PWL Cellular Neural Networks, 2001, Int. J. Bifurc. Chaos.

[23] Mathukumalli Vidyasagar. Minimum-seeking properties of analog neural networks with multilinear objective functions, 1995, IEEE Trans. Autom. Control.

[24] Stephen Grossberg, et al. Nonlinear neural networks: Principles, mechanisms, and architectures, 1988, Neural Networks.

[25] Morris W. Hirsch, et al. Convergent activation dynamics in continuous time networks, 1989, Neural Networks.

[26] R. A. Silverman, et al. Introductory Real Analysis, 1972.