LEARNING IN NEURAL NETWORKS WITH PARTIALLY STRUCTURED SYNAPTIC TRANSITIONS

We show that stochastic learning of attractors can take place when only potentiation, or only depression, of synaptic efficacies is driven in a structured, Hebbian way. In either case, the transition in the opposite direction occurs at random, but only upon presentation of a stimulus. The outcome is an associative memory with the palimpsest property. Structured potentiation is shown to produce more effective learning than structured depression, i.e. it yields a network with a substantially larger number of retrievable memories.
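The mechanism described above can be sketched for the structured-potentiation case with binary synapses. This is a minimal illustration, not the paper's model: the network size, coding level, and transition probabilities (`q_pot`, `q_dep`) are assumed for demonstration. Coactive pre/post pairs are potentiated stochastically in a Hebbian way, while depression is unstructured and happens only when a stimulus is presented, so older memories are gradually overwritten (the palimpsest property).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200       # number of neurons (illustrative size, not from the paper)
f = 0.5       # coding level: fraction of active units per pattern (assumed)
q_pot = 0.5   # prob. of structured (Hebbian) potentiation (assumed)
q_dep = 0.05  # prob. of random depression, applied only at presentation (assumed)

J = np.zeros((N, N), dtype=np.int8)  # binary synaptic efficacies, 0 or 1


def present(J, xi):
    """One stimulus presentation: coactive pre/post pairs are potentiated
    with prob q_pot (structured transition 0 -> 1); every synapse is
    independently depressed with prob q_dep (random transition 1 -> 0)."""
    hebb = np.outer(xi, xi).astype(bool)        # coactive pre/post pairs
    pot = hebb & (rng.random((N, N)) < q_pot)   # structured potentiation
    dep = rng.random((N, N)) < q_dep            # unstructured depression
    J[dep] = 0
    J[pot] = 1
    return J


# store a sequence of random patterns; earlier ones are slowly erased
patterns = (rng.random((20, N)) < f).astype(np.int8)
for xi in patterns:
    J = present(J, xi)

# signal proxy: mean input to active units minus mean input to inactive ones
for k in (0, 10, 19):
    xi = patterns[k]
    h = J @ xi
    print(f"pattern {k:2d}: signal = {h[xi == 1].mean() - h[xi == 0].mean():.2f}")
```

Running the loop shows the signal for recently stored patterns exceeding that of the oldest ones, the signature of a palimpsest memory.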