Contrastive Learning and Neural Oscillations

The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated by a learning rule that corresponds to gradient descent on a contrast function measuring the discrepancy between the free network and the network driven by the teacher signal. The CL approach provides a general, unified framework for developing new learning algorithms and shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed, with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed, together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
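To make the two-phase procedure concrete, the following is a minimal sketch of a contrastive Hebbian update in a continuous Hopfield-style network, in the spirit of the deterministic Boltzmann machine rule. The dynamics, function names, and parameters below are illustrative assumptions for this sketch, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def settle(W, b, x0, clamp_mask, clamp_vals, steps=200, dt=0.1):
    """Relax the graded-response network toward a fixed point.
    Units where clamp_mask is True are held at clamp_vals (teacher phase);
    an all-False mask gives the free phase."""
    x = x0.copy()
    x[clamp_mask] = clamp_vals[clamp_mask]
    for _ in range(steps):
        x += dt * (np.tanh(W @ x + b) - x)
        x[clamp_mask] = clamp_vals[clamp_mask]  # re-impose the teacher signal
    return x

def contrastive_step(W, b, x0, clamp_mask, clamp_vals, lr=0.05):
    """One oscillation of contrastive learning: settle with the teacher signal,
    settle without it, and move the weights along the difference of the Hebbian
    co-activation terms (a finite-step stand-in for gradient descent on the
    contrast function)."""
    xc = settle(W, b, x0, clamp_mask, clamp_vals)
    xf = settle(W, b, x0, np.zeros_like(clamp_mask), clamp_vals)
    dW = lr * (np.outer(xc, xc) - np.outer(xf, xf))
    np.fill_diagonal(dW, 0.0)   # no self-connections
    W += dW                     # outer products keep W symmetric
    b += lr * (xc - xf)
    return W, b

# Toy usage: a 6-unit network whose last two units are clamped to a target.
n = 6
W = 0.01 * rng.standard_normal((n, n)); W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
b = np.zeros(n)
clamp_mask = np.array([False] * 4 + [True] * 2)
clamp_vals = np.array([0.0] * 4 + [0.8, -0.8])
for _ in range(100):
    W, b = contrastive_step(W, b, np.zeros(n), clamp_mask, clamp_vals)
```

In practice the input units are often clamped in both phases and only the teacher (output) units distinguish the clamped from the free phase; the all-or-nothing clamping above is simply the shortest version of the idea.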
