Graded-response neurons and information encodings in autoassociative memories.

A general mean-field theory is presented for an attractor neural network in which each elementary unit is described by a single real-valued input variable and a single real-valued output variable, and whose synaptic strengths are determined by a covariance imprinting rule. In the case of threshold-linear units, a single equation is shown to yield the storage capacity for the retrieval of random activity patterns drawn from any given probability distribution. If this distribution produces binary patterns, the storage capacity is essentially the same as for networks of binary units. To explore the effects of storing more structured patterns, the case of a ternary distribution is studied. It is shown that the number of storable patterns can be much higher than in the binary case, whereas the total amount of retrievable information does not exceed the limit obtained with binary patterns.
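As an illustrative sketch, not spelled out in the abstract itself, the covariance imprinting rule and the threshold-linear transfer function are conventionally written as below; here \(\eta_i^\mu\) denotes the activity of unit \(i\) in stored pattern \(\mu\), \(a\) the mean activity, \(g\) a gain, \(\theta\) a threshold, and \(h_i\) the input to unit \(i\). The normalization and notation used in the paper may differ.
\[
J_{ij} \;=\; \frac{1}{N}\sum_{\mu=1}^{p}\bigl(\eta_i^\mu - a\bigr)\bigl(\eta_j^\mu - a\bigr),
\qquad
V_i \;=\; g\,[\,h_i - \theta\,]^{+} \;=\;
\begin{cases}
g\,(h_i-\theta), & h_i > \theta,\\[2pt]
0, & h_i \le \theta,
\end{cases}
\qquad
h_i \;=\; \sum_{j\neq i} J_{ij}\,V_j .
\]
The covariance form subtracts the mean activity from each pattern before taking the outer product, which is what allows patterns of arbitrary (including sparse or graded) activity distributions to be imprinted without a systematic bias in the couplings.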