On the State Space of the Binary Neural Network

Analysis of the state space of the fully connected binary neural network (the Hopfield model) remains an important objective in applying the network to pattern recognition and associative information retrieval. Most research on the network's state space has so far concentrated on enumerating stable states, often under the assumption that the patterns to be stored are random. We consider the case of known, deterministic codewords whose storage is required, and show that for this important case bounds on the retrieval probabilities and convergence rates can be obtained. The main tool we employ is birth-and-death Markov chains that describe the Hamming distance between the network's state and the stored patterns. The results apply both to the asynchronous network and to the Boltzmann machine, and can be used to compare codeword sets in terms of the efficiency of their retrieval when the neural network is used as a content-addressable memory.
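The birth-and-death viewpoint can be illustrated with a small sketch. Under the (simplifying) assumption that a single asynchronous update changes the Hamming distance d to a stored pattern by at most one, the distance evolves as a birth-and-death chain on states 0..n, and the retrieval probability is the probability of absorption at d = 0, given by the standard gambler's-ruin formula. The transition probabilities `p_up[d]` and `p_down[d]` below are placeholders for whatever the network's update rule actually induces; the paper's specific bounds are not reproduced here.

```python
def retrieval_probability(p_up, p_down, d0):
    """
    Absorption probability at Hamming distance 0 (successful retrieval) for a
    birth-and-death chain on states 0..n, with 0 and n absorbing.

    p_up[j-1]   : probability of moving from distance j to j+1 (1 <= j <= n-1)
    p_down[j-1] : probability of moving from distance j to j-1
    d0          : initial Hamming distance from the stored pattern

    Self-loop probabilities (staying at j) do not affect absorption
    probabilities and are therefore omitted.
    """
    n = len(p_up) + 1  # interior states are 1..n-1
    # Gambler's-ruin ratios: rho_k = prod_{j=1..k} p_down[j] / p_up[j], rho_0 = 1
    rho = [1.0]
    for j in range(1, n):
        rho.append(rho[-1] * p_down[j - 1] / p_up[j - 1])
    total = sum(rho)
    # P(absorbed at n before 0 | start d0) = sum_{k<d0} rho_k / sum_k rho_k;
    # retrieval is the complementary event.
    return 1.0 - sum(rho[:d0]) / total
```

For an unbiased chain (`p_up = p_down`), the formula reduces to the familiar `(n - d0)/n`; when the dynamics are biased toward the pattern (`p_down > p_up`), the retrieval probability approaches one, which is the regime the bounds in the paper quantify.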
