Dynamics of neural network operations

This chapter focuses on the dynamics of neural network operations. Network dynamics matter because they determine how a network behaves during operation: together with its structure, they define the functional capabilities the network can achieve. Every neural network has a particular set of dynamics, that is, a characteristic way of processing data, and these dynamics are tied to its structure. Concern about the stability of network dynamics has motivated a body of theoretical work, most of which models the dynamics with an energy-function approach. In this context the Lyapunov theorem has proved especially useful: if an energy function can be defined for a network, and that function satisfies certain criteria, then the system is guaranteed to be globally stable.
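The energy-function argument can be made concrete with a small numerical sketch. The following is a minimal, illustrative example (not taken from the chapter) of a Hopfield-style network: with symmetric weights and zero self-connections, the energy E(s) = -½ sᵀWs never increases under asynchronous threshold updates, which is exactly the Lyapunov-style condition for stability. The network size, random weights, and update count are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 16
W = rng.normal(size=(n, n))
W = (W + W.T) / 2          # symmetric weights: W[i, j] == W[j, i]
np.fill_diagonal(W, 0.0)   # no self-connections (zero diagonal)

def energy(s, W):
    """Hopfield energy E(s) = -1/2 * s^T W s."""
    return -0.5 * s @ W @ s

# Random initial state of binary (+1/-1) units.
s = rng.choice([-1.0, 1.0], size=n)

energies = [energy(s, W)]
for _ in range(200):
    i = rng.integers(n)                       # pick one unit at random
    s[i] = 1.0 if W[i] @ s >= 0 else -1.0     # asynchronous threshold update
    energies.append(energy(s, W))

# With symmetric W and zero diagonal, each update changes the energy by
# -(s_i' - s_i) * h_i <= 0, so the sequence is monotone non-increasing.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
print("initial energy:", energies[0], "final energy:", energies[-1])
```

Because the energy is bounded below and cannot increase, the network must settle into a stable state, which is the essence of the global-stability guarantee the Lyapunov theorem provides for such systems.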
