Information capacity in recurrent McCulloch-Pitts networks with sparsely coded memory states

A new approach to the asymptotic analysis of autoassociation properties in recurrent McCulloch-Pitts networks in the low-activity regime is proposed. Using information theory, this method examines the static structure of the stable states imprinted by a Hebbian storage process. In addition to the critical pattern capacity usually considered in the analysis of the Hopfield model, the authors introduce the notion of information capacity, which guarantees content addressability and provides a stricter upper bound on the information actually accessible in an autoassociation process. They calculate both capacities for two local learning rules that are very effective for sparsely coded patterns: the Hebb rule and the clipped Hebb rule. It turns out that for both rules the information capacity is exactly half the pattern capacity.
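For illustration, a minimal numerical sketch of the two storage rules named above may be helpful. It is not taken from the paper; the network size N, pattern number M, activity level p, threshold choice, and one-step retrieval scheme are all assumptions made here purely for demonstration. The sketch stores sparse binary patterns with the additive Hebb rule and its clipped variant, then probes content addressability from a partial cue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumed for this sketch, not taken from the paper)
N = 1000   # McCulloch-Pitts neurons
M = 200    # stored patterns
p = 0.02   # activity: fraction of active units per pattern (sparse coding)

# Sparsely coded binary memory patterns xi^mu in {0,1}^N
xi = (rng.random((M, N)) < p).astype(np.int32)

# Hebb rule: additive synaptic matrix J_ij = sum_mu xi_i^mu xi_j^mu
J_hebb = xi.T @ xi

# Clipped Hebb rule: binary synapses, J_ij = 1 iff some pattern coactivates i and j
J_clip = (J_hebb > 0).astype(np.int32)

def retrieve(J, cue, theta):
    """One-step parallel McCulloch-Pitts update: x_i = H(sum_j J_ij * cue_j - theta)."""
    return (J @ cue >= theta).astype(np.int32)

# Content addressability: cue = pattern 0 with every other active unit deleted
cue = xi[0].copy()
on = np.flatnonzero(cue)
cue[on[::2]] = 0

# For the clipped rule, a natural threshold is the number of active cue units
out = retrieve(J_clip, cue, theta=cue.sum())
recovered = out @ xi[0] / xi[0].sum()
spurious = int(out.sum() - out @ xi[0])
print(f"recovered fraction of pattern 0: {recovered:.2f}, spurious units: {spurious}")
```

At this loading the clipped rule typically completes the pattern from the half cue with few or no spurious units; the paper's asymptotic analysis concerns how large M can grow, relative to N and p, before such retrieval fails and how much retrievable information this corresponds to.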