Increasing the efficiency of a neural network through unlearning

Abstract. It has been suggested that dream (REM) sleep leads to the unlearning of parasitic or spurious states. Here we present the results of an extensive numerical study of unlearning in a network of formal neurons (Ising spins) whose activity may vary. Our results are threefold. First, unlearning greatly improves the performance of the network; e.g., the storage capacity may be more than quadrupled. Second, the optimal number of unlearning steps ("dreams") does not depend on the activity. Third, using the simplest form of Hebbian learning, the network can store and retrieve patterns of differing activities. A microscopic picture of the underlying processes is presented.
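
The abstract describes the procedure only in outline. As a purely illustrative sketch, the Python code below assembles the standard ingredients it refers to: the simplest Hebbian storage of Ising (±1) patterns with adjustable activity, zero-temperature retrieval dynamics, and Hopfield-Feinstein-Palmer-style unlearning, in which the network relaxes from a random state and the resulting attractor is weakly subtracted from the couplings. The function names, the unlearning strength eps, the number of dreams, and all numerical values are assumptions chosen for illustration, not parameters taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_patterns(p, n, activity=0.5):
    """p random Ising (+/-1) patterns of n spins; `activity` is the fraction of +1 spins."""
    return np.where(rng.random((p, n)) < activity, 1.0, -1.0)

def hebb(patterns):
    """Simplest Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, with zero diagonal."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)
    return J

def relax(J, s, max_sweeps=50):
    """Asynchronous zero-temperature dynamics s_i -> sign(sum_j J_ij s_j) until a fixed point."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1.0 if J[i] @ s >= 0 else -1.0
            if new != s[i]:
                s[i], changed = new, True
        if not changed:  # fixed point (attractor) reached
            break
    return s

def unlearn(J, dreams=100, eps=0.01):
    """One 'dream' per step: relax from a random state, then weakly subtract the attractor."""
    n = J.shape[0]
    for _ in range(dreams):
        attractor = relax(J, rng.choice([-1.0, 1.0], size=n))
        J -= (eps / n) * np.outer(attractor, attractor)
        np.fill_diagonal(J, 0.0)
    return J

# Toy check (illustrative values): retrieval overlap with one stored pattern,
# measured before and after unlearning, starting from a noisy cue.
patterns = make_patterns(p=10, n=100, activity=0.3)
J = hebb(patterns)
probe = patterns[0] * np.where(rng.random(100) < 0.1, -1.0, 1.0)
print("overlap before unlearning:", relax(J, probe) @ patterns[0] / 100)
J = unlearn(J, dreams=100, eps=0.01)
print("overlap after  unlearning:", relax(J, probe) @ patterns[0] / 100)
```

The 1/N normalization of the unlearning increment in this sketch simply mirrors the Hebbian term above it; conventions in the literature differ, and eps would be rescaled accordingly.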