An approach to training a generalized Hopfield network is developed and evaluated. Both the weight symmetry constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with backpropagation through time, using noisy versions of the memorized patterns; training in this way is referred to as noisy associative training (NAT). NAT is evaluated on both random and correlated data, with a large number of training runs for each experiment. The data sets include uniformly distributed random data and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or pseudo-inverse training. These networks also have, on average, 13% fewer spurious memories than networks trained with Hebbian or pseudo-inverse training. Networks memorizing over 2N patterns (where N is the number of nodes in the network) are typically produced. Performance on correlated data shows an even greater improvement over Hebbian and pseudo-inverse training: an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.
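The training procedure the abstract describes can be sketched as follows. This is a minimal illustration, assuming a PyTorch implementation in which the recurrent Hopfield dynamics are unrolled with a differentiable tanh update and trained by backpropagation through time on noise-corrupted patterns; the network size, pattern count, noise level, unroll depth, and optimizer settings (N, P, STEPS, NOISE, Adam) are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal sketch of noisy associative training (NAT) under the assumptions above.
import torch

N = 32       # number of nodes (pattern dimension) -- illustrative
P = 64       # number of stored patterns; NAT allows storing more than N
STEPS = 5    # synchronous update steps unrolled for backpropagation through time
NOISE = 0.1  # probability of flipping each bit in a training input

# Generalized Hopfield network: W is not forced to be symmetric and its
# diagonal (self-connections) is left unconstrained, as NAT permits.
W = torch.zeros(N, N, requires_grad=True)
b = torch.zeros(N, requires_grad=True)

patterns = torch.randint(0, 2, (P, N)).float() * 2 - 1  # random +/-1 patterns
opt = torch.optim.Adam([W, b], lr=1e-2)

def unroll(x, steps):
    """Run `steps` synchronous updates; tanh keeps the dynamics differentiable."""
    for _ in range(steps):
        x = torch.tanh(x @ W.T + b)
    return x

for epoch in range(2000):
    # Corrupt the stored patterns by flipping a random subset of bits.
    flips = (torch.rand(P, N) < NOISE).float()
    noisy = patterns * (1 - 2 * flips)

    out = unroll(noisy, STEPS)                 # backprop through the unrolled dynamics
    loss = torch.mean((out - patterns) ** 2)   # recover the clean memorized patterns

    opt.zero_grad()
    loss.backward()
    opt.step()
```

At recall time one would typically replace the tanh update with a hard sign update and iterate to a fixed point; the learned weight matrix generally ends up asymmetric with a nonzero diagonal, which is exactly the relaxation of the standard Hopfield constraints that the abstract describes.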