Maximum Entropy Connections: Neural Networks

Maximum entropy estimation of probability distributions constitutes a theoretical foundation for the Hopfield associative memory. Given the first- and second-order statistics of a collection of binary variables, the maximum entropy distribution is the exponential of a Hopfield network's energy. In a special case, an explicit expression for the connection strengths of this maximum entropy network is given; it converges exactly to Hopfield's covariance prescription in the limit of a large number of random patterns. This connection between probabilistic inference and neural networks clarifies the effective assumptions and approximations made when a Hopfield network is used as an associative memory, and motivates several modifications to the original algorithms.
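
For concreteness, a minimal sketch of the relationship described above, in assumed notation (binary units $s_i \in \{-1,+1\}$, stored patterns $\xi_i^\mu$, $\mu = 1,\dots,p$; these symbols are illustrative and need not match the paper's own). The maximum entropy distribution consistent with prescribed first and second moments of the $s_i$ has the exponential-family form

$$ P(\mathbf{s}) \;=\; \frac{1}{Z} \exp\!\Big( \sum_i h_i s_i \;+\; \tfrac{1}{2}\sum_{i \neq j} J_{ij}\, s_i s_j \Big) \;=\; \frac{1}{Z}\, e^{-E(\mathbf{s})}, \qquad E(\mathbf{s}) \;=\; -\tfrac{1}{2}\sum_{i \neq j} J_{ij}\, s_i s_j \;-\; \sum_i h_i s_i , $$

i.e. the exponential of (minus) a Hopfield energy. The covariance prescription to which the explicit expression is said to converge sets, up to an overall scale,

$$ J_{ij} \;\propto\; \frac{1}{N} \sum_{\mu=1}^{p} \big(\xi_i^\mu - \bar{\xi}\big)\big(\xi_j^\mu - \bar{\xi}\big), \qquad i \neq j, $$

where $\bar{\xi}$ denotes the mean activity of the patterns; for unbiased $\pm 1$ patterns this reduces to the familiar Hebbian outer-product rule.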
