Stability of fully asynchronous discrete-time discrete-state dynamic networks

We consider networks of a large number of neurons (or units, processors, ...) whose dynamics are fully asynchronous with overlapping updates. We suppose that the neurons take a finite number of states (discrete states) and that the updating scheme is discrete in time. We make no hypotheses on the activation functions of the neurons; the networks may have multiple cycles and basins of attraction. We derive conditions on the initialization of such networks that ensure convergence to fixed points only. An application to a fully asynchronous Hopfield neural network validates our study.
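
To make the setting concrete, the following is a minimal Python sketch of a discrete-state (+/-1) Hopfield network evolved under a fully asynchronous scheme: at each discrete time step an arbitrary, possibly overlapping subset of neurons recomputes its state, and each updating neuron may read an outdated snapshot of the network (bounded delay). The Hebbian weight construction, the update probability, the delay bound, and the names hopfield_weights and async_evolve are illustrative choices, not the exact model or the initialization conditions derived in the paper.

import numpy as np


def hopfield_weights(patterns):
    # Hebbian weights for a discrete-state (+/-1) Hopfield network, zero diagonal.
    p = np.asarray(patterns, dtype=float)
    n = p.shape[1]
    w = p.T @ p / n
    np.fill_diagonal(w, 0.0)
    return w


def async_evolve(w, s0, steps=200, max_delay=3, update_prob=0.5, seed=0):
    # Fully asynchronous evolution: a random (overlapping) subset of neurons
    # updates at every step, each one reading a possibly delayed global state.
    rng = np.random.default_rng(seed)
    n = w.shape[0]
    history = [np.array(s0, dtype=int)]
    s = history[0].copy()
    for _ in range(steps):
        active = np.flatnonzero(rng.random(n) < update_prob)
        if active.size == 0:
            continue
        new_s = s.copy()
        for i in active:
            lag = rng.integers(0, min(max_delay, len(history)))   # bounded delay
            view = history[-1 - lag]                              # delayed snapshot
            new_s[i] = 1 if w[i] @ view >= 0.0 else -1            # threshold unit
        s = new_s
        history.append(s.copy())
    return s


pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
w = hopfield_weights([pattern])
noisy = pattern.copy()
noisy[:2] *= -1                   # corrupt two components
print(async_evolve(w, noisy))     # typically recovers the stored pattern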
