A recurrent neural network: Limitations and training

Abstract A recurrent, synchronous neural network is treated as a collection of independent perceptrons. The dynamics of the network can be described by a mapping: a finite set of transitions in the state space of the network. We define a legal mapping as one that a synchronous neural network is able to perform, and state necessary and sufficient conditions for a mapping to be legal. A learning algorithm for the network, based on the perceptron learning algorithm, is guaranteed to converge to a solution whenever the network is trained to realize a legal mapping. The algorithm is shown to perform a gradient descent search for a minimum of a cost function, an error measure defined over the weight space. Its performance in associative memory and in the production of temporal sequences is illustrated by numerical simulations. A method is proposed for legalizing any given mapping at the expense of adding a finite number of neurons to the network. It is also shown that when the number of transitions in a random mapping is smaller than the number of neurons in the network, the probability that the mapping is legal approaches unity.
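
To fix ideas, here is a minimal formal sketch under the standard assumptions for such networks; the notation is ours, not taken from the paper. With neuron states $x_i \in \{-1, +1\}$ and weight matrix $W = (w_{ij})$, a synchronous step updates all $N$ neurons simultaneously:

    x_i(t+1) = \operatorname{sgn}\!\left( \sum_{j=1}^{N} w_{ij}\, x_j(t) \right), \qquad i = 1, \dots, N.

A mapping is then a finite set of desired transitions $\{(x^\mu, y^\mu)\}_{\mu=1}^{P}$. Since row $i$ of $W$ affects only output $i$, the network decomposes into $N$ independent perceptrons, so under this model a mapping is legal precisely when, for every neuron $i$, the $P$ linear inequalities

    y_i^\mu \sum_{j=1}^{N} w_{ij}\, x_j^\mu > 0, \qquad \mu = 1, \dots, P,

are simultaneously satisfiable, i.e. when each neuron's input-output assignment is linearly separable. For $P$ points in general position in $\mathbb{R}^N$, Cover's function-counting theorem gives $C(P, N) = 2 \sum_{k=0}^{N-1} \binom{P-1}{k}$ linearly separable dichotomies, which equals $2^P$ (all of them) whenever $P \le N$; this is one way to see why a random mapping with fewer transitions than neurons should be legal with probability approaching unity.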

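The learning procedure also lends itself to a short illustration. The sketch below is ours, not the paper's code (the abstract describes the algorithm only at a high level): it assumes ±1 states and applies the classic perceptron update row by row, to exactly those neurons that violate a desired transition.

import numpy as np

def train_transitions(X, Y, epochs=1000, eta=1.0):
    """Perceptron-style training of a synchronous recurrent net.

    X, Y: (P, N) arrays with entries in {-1, +1}; row mu of Y is the
    desired one-step successor of row mu of X. Each of the N units is
    trained as an independent perceptron on the same P input patterns.
    """
    P, N = X.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        errors = 0
        for mu in range(P):
            s = np.sign(W @ X[mu])
            s[s == 0] = 1                  # resolve sgn(0) as +1
            wrong = s != Y[mu]             # units violating transition mu
            # classic perceptron rule, applied only to the offending rows
            W[wrong] += eta * np.outer(Y[mu][wrong], X[mu])
            errors += int(wrong.sum())
        if errors == 0:                    # every transition is realized
            return W
    raise RuntimeError("no solution found; the mapping may not be legal")

Storing fixed points for associative memory corresponds to Y = X, and a temporal cycle x^1 -> x^2 -> ... -> x^P -> x^1 to Y = np.roll(X, -1, axis=0). By the perceptron convergence theorem, each row of W settles after finitely many updates whenever the mapping is legal, which matches the convergence guarantee stated in the abstract.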