The basins of attraction of a new Hopfield learning rule

The nature of the basins of attraction of a Hopfield network is as important as its capacity. Here a new learning rule is re-introduced. This learning rule has a higher capacity than the Hebb rule, and retains important functionality, such as incrementality and locality, which the pseudo-inverse rule lacks. However, the basins of attraction of the fixed points of this learning rule have not yet been studied. Three important characteristics of basins of attraction are considered: direct and indirect basins of attraction, the distribution of basin sizes, and the shape of the basins. The results for the new learning rule are compared with those of the Hebb rule. The direct and indirect basins of attraction are generally larger for the new rule than for the Hebb rule, the distribution of basin sizes is more even, and the basins are rounder in shape.
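To make the comparison concrete, the following is a minimal sketch of the two learning rules discussed above: the incremental Hebb rule and the Storkey (1997) rule, in which each weight update subtracts local-field terms h_ij from the plain Hebbian outer product. The function names and the simple asynchronous recall routine are illustrative choices, not part of the paper; basin size can then be probed by flipping bits of a stored pattern and checking whether recall returns to it.

```python
import numpy as np

def hebb_update(W, xi):
    """Incremental Hebbian update: W += xi xi^T / n, with zero diagonal."""
    n = len(xi)
    W = W + np.outer(xi, xi) / n
    np.fill_diagonal(W, 0.0)
    return W

def storkey_update(W, xi):
    """One incremental step of the Storkey rule:
        W_ij += (xi_i xi_j - xi_i h_ji - h_ij xi_j) / n,
    where h_ij = sum over k != i, j of W_ik xi_k is a local field."""
    n = len(xi)
    h = W @ xi                          # sum_k W_ik xi_k (diagonal is zero)
    H = h[:, None] - W * xi[None, :]    # H[i, j] = sum_{k != i, j} W_ik xi_k
    dW = (np.outer(xi, xi) - xi[:, None] * H.T - H * xi[None, :]) / n
    W = W + dW
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_sweeps=50):
    """Asynchronous recall: update units in random order until a fixed point
    is reached (or the sweep limit is hit)."""
    s = state.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(len(s)):
            new = 1.0 if W[i] @ s >= 0 else -1.0
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s
```

At low loading both rules make the stored patterns fixed points; the point of the paper is how the basins around those fixed points differ in size and shape, which can be explored empirically by corrupting a stored pattern with increasing numbers of bit flips before calling `recall`.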
