We complement our previous work [arXiv:0707.0565] with the full (non-diluted) solution describing the stable states of an attractor network that stores correlated patterns of activity. The new solution provides a good fit to simulations of a network storing the feature norms of McRae and colleagues [McRae et al., 2005], experimentally obtained combinations of features representing concepts in semantic memory. We discuss three ways to improve the storage capacity of the network: adding uninformative neurons, removing informative neurons, and introducing popularity-modulated Hebbian learning. We show that if the strength of a synapse is modulated by an exponential decay of the popularity of its pre-synaptic neuron, any distribution of patterns can be stored and retrieved with approximately optimal storage capacity, i.e., C ~ I·p: the minimum number of connections per neuron needed to sustain the retrieval of a pattern is proportional to the information content of a pattern multiplied by the number of patterns stored in the network.
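The popularity-modulated learning rule described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact model: the pattern statistics, the covariance form of the Hebbian rule, and the normalization of the exponential factor are all assumptions made for the example; the one element taken from the text is that each synapse is scaled by an exponentially decaying function of the popularity of its pre-synaptic neuron.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 20  # neurons and stored patterns (illustrative sizes)

# Binary patterns with heterogeneous per-neuron firing probabilities,
# so that different neurons have different "popularity" across patterns.
prob = rng.uniform(0.05, 0.5, size=N)           # assumed distribution
xi = (rng.random((P, N)) < prob).astype(float)  # patterns, shape (P, N)

# Popularity a_j: fraction of patterns in which neuron j is active.
a = xi.mean(axis=0)

# Exponential popularity modulation (assumed normalization by mean popularity).
g = np.exp(-a / a.mean())

# Covariance-style Hebbian matrix; column j (the pre-synaptic neuron of
# synapse J[i, j]) is scaled by the modulation factor g[j].
J = ((xi - a).T @ (xi - a)) * g[None, :] / N
np.fill_diagonal(J, 0.0)  # no self-connections
```

Scaling by the pre-synaptic popularity down-weights the contribution of neurons that participate in many patterns, which is what compensates for the correlations those popular neurons introduce.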
[1] D. J. Wallace et al., "Training with noise and the storage of correlated patterns in a neural network model," 1989.
[2] D. Sagi et al., "Dynamics of Memory Representations in Networks with Novelty-Facilitated Synaptic Plasticity," Neuron, 2006.
[3] Mark S. Seidenberg et al., "Semantic feature production norms for a large set of living and nonliving things," Behavior Research Methods, 2005.
[4] M. Shiino et al., "Self-consistent signal-to-noise analysis and its application to analogue neural networks with asymmetric connections," 1992.
[5] Gutfreund, "Neural networks with hierarchically correlated patterns," Physical Review A, General Physics, 1988.
[6] M. L. Lambon Ralph et al., "Prototypicality, distinctiveness, and intercorrelation: Analyses of the semantic attributes of living and nonliving concepts," Cognitive Neuropsychology, 2001.