On the Characteristics of the Autoassociative Memory with Nonzero-Diagonal Terms in the Memory Matrix

A statistical method is applied to explore the unique characteristics of a class of neural network autoassociative memories with N neurons and first-order synaptic interconnections. The memory matrix is constructed to store M = αN vectors based on the outer-product learning algorithm. We prove theoretically that, by setting all the diagonal terms of the memory matrix to M and letting the input error ratio ρ = 0, the probability of successful recall Pr steadily decreases as α increases, but as α increases past 1.0, Pr begins to increase slowly. When 0 < ρ ≤ 0.5, the network exhibits strong error-correction capability if α ≤ 0.15, and this capability is shown to decrease rapidly as α increases. The network essentially loses all its error-correction capability at α = 2, regardless of the value of ρ. When 0 < ρ ≤ 0.5, and under the constraint Pr > 0.99, the tradeoff between the number of stable states and their attraction force is analyzed, and the memory capacity is shown to be 0.15N at best.
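The construction described above can be sketched numerically. The following is a minimal illustration, not the paper's analysis: it assumes bipolar (±1) stored vectors, the outer-product (Hebbian) matrix whose diagonal terms then equal M, and synchronous sign-threshold recall; the variable names alpha and rho mirror the loading ratio M/N and the input error ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200             # number of neurons
alpha = 0.05        # loading ratio (assumption: alpha = M/N)
M = int(alpha * N)  # number of stored vectors

# Stored patterns: M random bipolar (+1/-1) vectors of length N.
X = rng.choice([-1, 1], size=(M, N))

# Outer-product memory matrix. Each diagonal entry is the sum of
# M squared +-1 components, so T[i, i] == M for every i -- the
# "diagonal terms set to M" case discussed in the abstract.
T = X.T @ X

def recall(probe, steps=10):
    """Synchronous sign-threshold recall (a common convention;
    the exact update rule is not specified in this excerpt)."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.where(T @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Input error ratio rho = 0: probe with the stored vectors themselves
# and count how many are recalled exactly.
ok = sum(np.array_equal(recall(x), x) for x in X)
print(f"{ok}/{M} stored vectors recalled exactly")

# Input error ratio rho > 0: flip rho*N randomly chosen bits of one
# stored vector and measure how many errors remain after recall.
rho = 0.1
noisy = X[0].copy()
flip = rng.choice(N, size=int(rho * N), replace=False)
noisy[flip] *= -1
residual = int(np.sum(recall(noisy) != X[0]))
print(f"residual errors after recall from noisy probe: {residual}")
```

At this small loading (alpha = 0.05, well under the 0.15 capacity figure in the abstract), recall from clean probes succeeds and most of the injected errors are corrected; raising alpha toward and beyond 1.0 in this sketch degrades the error correction, in line with the trend the abstract describes.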