The Effect of Hebbian Learning on Optimisation in Hopfield Networks

In neural networks, two specific dynamical behaviours are well known: 1) Networks naturally find patterns of activation that locally minimise constraints among interactions. This can be understood as the local minimisation of an energy or potential function, or the optimisation of an objective function. 2) In distinct scenarios, Hebbian learning can create new interactions that form associative memories of activation patterns. In this paper we show that these two behaviours have a surprising interaction: learning of this type significantly improves the ability of a neural network to find configurations that satisfy constraints, i.e. to perform effective optimisation. Specifically, the network develops a memory of the attractors that it has visited but, importantly, is able to generalise over previously visited attractors to increase the basins of attraction of superior attractors before they are visited. The network is ultimately transformed into a different network that has only one basin of attraction, but this attractor corresponds to a configuration that has very low energy in the original network. The new network thus finds optimised configurations that were unattainable (had exponentially small basins of attraction) under the original network dynamics.
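
To make the interaction concrete, the sketch below pairs the two behaviours described above: asynchronous Hopfield dynamics descend the energy E(s) = -(1/2) Σ_ij w_ij s_i s_j to a local attractor, and a small Hebbian increment Δw_ij = δ s_i s_j is then applied at the visited attractor state. This is a minimal illustration, not the paper's exact experimental setup: the random symmetric problem instance, the learning rate δ, and the run lengths are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50  # number of +/-1 units

# Hypothetical "problem" network: random symmetric couplings, zero diagonal.
# Its energy E(s) = -1/2 * s^T W s defines the constraint-satisfaction task.
W_problem = np.triu(rng.choice([-1.0, 1.0], size=(N, N)), k=1)
W_problem = W_problem + W_problem.T

W = W_problem.copy()   # weights that Hebbian learning will modify
delta = 0.002          # illustrative Hebbian learning rate (assumed value)

def energy(s, weights):
    """Hopfield energy of configuration s under the given weights."""
    return -0.5 * s @ weights @ s

def relax(s, weights, sweeps=30):
    """Asynchronous Hopfield dynamics: descend to a local attractor."""
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1.0 if weights[i] @ s >= 0.0 else -1.0
    return s

for trial in range(300):
    s = rng.choice([-1.0, 1.0], size=N)  # random restart
    s = relax(s, W)                      # attractor of the *learned* network
    W += delta * np.outer(s, s)          # Hebbian update on the visited attractor
    np.fill_diagonal(W, 0.0)
    if trial % 50 == 0:
        # Score the attractor against the ORIGINAL energy function: this is
        # the quantity that learning should drive down across restarts.
        print(trial, energy(s, W_problem))
```

On runs of this kind, the attractors reached under the learned weights W tend, over successive restarts, toward configurations of lower energy under the fixed W_problem, matching the effect claimed above: the Hebbian updates enlarge the basins of attraction of low-energy configurations faster than those of high-energy ones.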
