Perceptron Hamming-stability learning rule for Hopfield associative memory

In this paper, we design an optimal learning rule for the Hopfield associative memory (HAM) based on three well-recognized criteria: every desired attractor must be made not only isolatedly stable but also asymptotically stable, and the number of spurious stable states should be as small as possible. These criteria are crucial for constructing a satisfactory associative memory. We first analyze the real cause of the unsatisfactory performance of the Hebb rule and of many other existing learning rules designed for HAMs, and then show that the three criteria actually amount to widely expanding the basin of attraction around each desired attractor. One effective way to widely expand the basins of attraction of all desired attractors is to appropriately dig a steep kernel basin of attraction around each of them. To this end, we introduce a concept called Hamming-stability. Surprisingly, we find that Hamming-stability for all desired attractors can be reduced to a moderately expansive linear separability condition at each neuron, so that the well-known Rosenblatt perceptron learning rule is exactly the right tool for learning Hamming-stability. Extensive experiments were conducted, convincingly showing that the proposed perceptron Hamming-stability learning rule indeed satisfies the three optimality criteria.
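The following is a minimal illustrative sketch, not the authors' implementation: it trains a Hopfield-type weight matrix with a per-neuron Rosenblatt perceptron rule so that each stored pattern, together with its one-bit Hamming neighbours, is mapped back toward the stored pattern. This is only a rough stand-in for the Hamming-stability condition described in the abstract; the function names, the margin parameter, and the choice of one-bit neighbours are assumptions for illustration.

```python
import numpy as np

def perceptron_hamming_train(patterns, epochs=100, lr=0.1, margin=0.0):
    """Sketch of perceptron-based Hamming-stability training.

    patterns: (P, N) array of +/-1 stored patterns (hypothetical interface).
    Returns a weight matrix W (zero diagonal) and bias vector b.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    b = np.zeros(N)

    # Training set: each stored pattern plus all of its 1-bit-flipped
    # neighbours, all required to map to the original pattern.
    inputs, targets = [], []
    for x in patterns:
        inputs.append(x.copy()); targets.append(x.copy())
        for j in range(N):
            y = x.copy(); y[j] = -y[j]
            inputs.append(y); targets.append(x.copy())
    inputs, targets = np.array(inputs, float), np.array(targets, float)

    for _ in range(epochs):
        updated = False
        for x, t in zip(inputs, targets):
            for i in range(N):
                xi = x.copy(); xi[i] = 0.0  # exclude self-connection
                # Linear separability condition at neuron i:
                # t[i] * (w_i . x + b_i) must exceed the margin.
                if t[i] * (W[i] @ xi + b[i]) <= margin:
                    W[i] += lr * t[i] * xi   # Rosenblatt perceptron update
                    b[i] += lr * t[i]
                    updated = True
        if not updated:
            break  # all conditions satisfied
    return W, b

def recall(W, b, probe, steps=50):
    """Synchronous recall from a noisy probe (illustrative only)."""
    s = probe.astype(float).copy()
    for _ in range(steps):
        s_new = np.sign(W @ s + b)
        s_new[s_new == 0] = 1.0
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s
```

A small usage example under the same assumptions: train on a few random +/-1 patterns, flip one bit of a stored pattern, and check that `recall` returns the original pattern; stability of the one-bit neighbourhood is what the perceptron conditions above enforce when they are satisfiable.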
