Overlapping cell assemblies from correlators

Abstract

It has been hypothesised that recurrent neural networks, known as Cell Assemblies (CAs), are the neural basis of concepts. An essential property of such networks is that neurons can participate in multiple CAs; that is, the CAs overlap. We derive a correlatory Hebbian learning rule that makes the synaptic weight approximate the fraction of time the postsynaptic neuron fires when the presynaptic neuron fires. We then modify this rule into a compensatory rule that normalises the total synaptic weight leaving a neuron. This rule allows sparse patterns to form CAs and eases the formation of overlapping CAs. Simulations show the formation of overlapping CAs.
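As a rough illustration of the two rules described above, the sketch below implements one plausible reading of them: a weight update that nudges each synapse toward 1 when the postsynaptic neuron fires alongside the presynaptic one (and toward 0 otherwise), so the weight tracks the fraction of presynaptic spikes accompanied by a postsynaptic spike, followed by a compensatory rescaling of each neuron's total outgoing weight. The network size, learning rate, target weight total, and firing probabilities are all assumed values, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20            # hypothetical network size (assumed)
eta = 0.05        # learning rate (assumed)
w_total = 4.0     # target total outgoing weight per neuron (assumed)

# Synaptic weights w[i, j]: connection from presynaptic neuron i
# to postsynaptic neuron j; initialised uniformly for illustration.
w = np.full((n, n), 0.5)

def correlatory_update(w, pre, post, eta):
    """Move w[i, j] toward 1 when j fires while i fires, and toward 0
    when i fires but j does not, so w[i, j] approximates the fraction
    of i's spikes on which j also fired."""
    for i in np.where(pre)[0]:
        w[i] += eta * (post.astype(float) - w[i])
    return w

def compensatory_normalize(w, w_total):
    """Rescale each neuron's outgoing weights so they sum to w_total."""
    sums = w.sum(axis=1, keepdims=True)
    return w * (w_total / np.maximum(sums, 1e-12))

# One simulated step with random sparse firing (illustration only):
pre = rng.random(n) < 0.2
post = rng.random(n) < 0.2
w = correlatory_update(w, pre, post, eta)
w = compensatory_normalize(w, w_total)
```

The compensatory step keeps every neuron's total outgoing weight fixed, so strengthening one synapse implicitly weakens the others; this competition is what lets sparse patterns recruit neurons into CAs without runaway weight growth.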