Analysis of the state space of the fully connected binary neural network (the Hopfield model) remains an important objective in applying the network to pattern recognition and associative information retrieval. Most research on the network's state space has concentrated on enumerating stable states, often under the assumption that the patterns to be stored are random. We consider instead the storage of deterministic, known codewords, and show that for this important case bounds on the retrieval probabilities and convergence rates can be derived. The main tool we employ is a Birth-and-Death Markov chain that describes the Hamming distance between the network's state and a stored pattern. The results apply both to the asynchronous network and to the Boltzmann machine, and can be used to compare codeword sets in terms of the efficiency of their retrieval when the neural network is used as a content-addressable memory.
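As a minimal illustrative sketch (not the paper's construction), the following Python snippet shows why asynchronous dynamics induce a Birth-and-Death structure: only one ±1 neuron is updated per step, so the Hamming distance to any fixed stored pattern can move only to an adjacent value or stay put. The Hebbian weight rule, the two random 16-bit codewords, and the starting offset of four flipped bits are all assumptions chosen for the demonstration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_weights(patterns):
    """Outer-product (Hebbian) weight matrix for +/-1 codewords, zero diagonal.
    This storage rule is an assumption for the sketch."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W / len(patterns[0])

def async_step(state, W):
    """Asynchronous update: one randomly chosen neuron takes the sign
    of its local field. At most one bit changes per step."""
    i = rng.integers(len(state))
    h = W[i] @ state
    state[i] = 1 if h >= 0 else -1
    return state

# Two hypothetical 16-bit codewords standing in for the deterministic set.
n = 16
codewords = [rng.choice([-1, 1], size=n) for _ in range(2)]
W = hebbian_weights(codewords)

# Start a few flips away from codewords[0] and track the Hamming distance;
# it changes by at most one per step, i.e., a birth-and-death trajectory.
state = codewords[0].copy()
state[:4] *= -1
for t in range(40):
    d = int(np.sum(state != codewords[0]))
    print(t, d)
    async_step(state, W)
```

Because the printed distance sequence never jumps by more than one, its evolution can be analyzed as a Birth-and-Death Markov chain on the distance values, which is what yields the retrieval-probability and convergence-rate bounds discussed above.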