On the asymptotic information storage capacity of neural networks

Neural networks can be useful and economical as associative memories, even in technical applications. The asymptotic information storage capacity of such neural networks is defined and then calculated and compared for various local synaptic rules. It turns out that among these rules the simple Hebb rule is optimal in terms of its storage capacity. Furthermore, the capacity of the clipped Hebb rule (C = ln 2) is even higher than that of the unclipped Hebb rule (C = 1/(8 ln 2)).
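The two learning rules compared above can be illustrated with a small simulation. The following is a minimal sketch, not the paper's construction: it stores sparse random binary pattern pairs in a matrix memory using both the additive (unclipped) Hebb rule and its clipped 0/1 version, and retrieves by thresholding the dendritic sums at the input activity. The parameters n, k, M and the threshold-at-k retrieval step are illustrative assumptions; for orientation, ln 2 ≈ 0.69 and 1/(8 ln 2) ≈ 0.18 bits per synapse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not taken from the paper):
n = 256   # number of neurons (pattern length)
k = 8     # number of active units per sparse binary pattern
M = 200   # number of stored pattern pairs

def random_sparse_pattern(n, k, rng):
    """Binary pattern with exactly k ones."""
    x = np.zeros(n, dtype=np.uint8)
    x[rng.choice(n, size=k, replace=False)] = 1
    return x

patterns = [(random_sparse_pattern(n, k, rng), random_sparse_pattern(n, k, rng))
            for _ in range(M)]

# Unclipped (additive) Hebb rule: accumulate outer products of pattern pairs.
W_add = np.zeros((n, n), dtype=np.int32)
for x, y in patterns:
    W_add += np.outer(y, x)

# Clipped Hebb rule: keep only whether a pre/post coincidence ever occurred (0/1 synapses).
W_clip = (W_add > 0).astype(np.uint8)

def recall(W, x, k):
    """Retrieve by thresholding the dendritic sums at the input activity k."""
    s = W @ x
    return (s >= k).astype(np.uint8)

# Check recall of the first stored pair with the clipped matrix.
x0, y0 = patterns[0]
y_hat = recall(W_clip, x0, k)
print("correct ones recovered:", int(np.sum(y_hat & y0)), "of", k)
print("spurious ones:", int(np.sum(y_hat & (1 - y0))))
```

With sparse patterns and moderate loading, the clipped matrix typically recovers the stored output with few or no spurious units; the point of the capacity analysis is that, per synapse, this binary memory can in fact store more information asymptotically than the additive one.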