High capacity associative memories and connection constraints

High capacity associative neural networks can be built from networks of perceptrons trained with simple perceptron learning. Such networks perform much better than those trained with the standard one-shot Hebbian learning of the Hopfield model. We report an experimental investigation into how these networks perform when the connection weights are not free to take arbitrary values. Three restrictions are investigated: a symmetry constraint, a sign constraint and a dilution constraint. The choice of these constraints is motivated by both engineering and biological considerations.
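To make the setup concrete, the following is a minimal sketch of perceptron training for a network of N perceptron units acting as an associative memory, with a sign constraint enforced by clipping any weight whose sign disagrees with a prescribed sign matrix to zero. The clipping scheme, function names and parameters here are illustrative assumptions, not the exact procedure used in the paper.

```python
import numpy as np

def train_sign_constrained(patterns, signs, epochs=100, lr=1.0):
    """Perceptron training for a binary associative memory in which
    each weight W[i, j] must keep a prescribed sign (sign constraint).

    patterns: (P, N) array of +/-1 training patterns
    signs:    (N, N) array of +/-1 giving the allowed sign of each weight
    NOTE: illustrative sketch; the clipping-to-zero rule is an assumption.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            for i in range(N):
                # Perceptron rule: update unit i's incoming weights
                # whenever its local field does not align with the target bit.
                if xi[i] * (W[i] @ xi) <= 0:
                    stable = False
                    W[i] += lr * xi[i] * xi
                    W[i, i] = 0.0  # no self-connections
                    # Enforce the sign constraint: clip weights whose
                    # sign disagrees with the prescribed one to zero.
                    W[i][W[i] * signs[i] < 0] = 0.0
        if stable:  # all patterns are stable fixed points
            break
    return W

def recall(W, probe, steps=20):
    """Synchronous recall dynamics from a (possibly noisy) probe state."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

A symmetry constraint could be imposed analogously (e.g. by symmetrising W after each sweep), and a dilution constraint by fixing a random mask of weights permanently to zero; under the sign constraint the usual perceptron convergence guarantee no longer applies, which is why the epoch cap is needed.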
