Initial state randomness improves sequence learning in a model hippocampal network.

Randomness can be a useful component of computation. Using a computationally minimal but still biologically based model of the hippocampus, we evaluate the effects of initial state randomization on learning a cognitive problem that requires this brain structure. Greater randomness of initial states leads to more robust performance in simulations of transverse patterning, a context-dependent discrimination task that we code as a sequence prediction problem. At the conclusion of training, greater initial randomness during training trials also correlates with increased repetitive firing of select individual neurons, previously named local context neurons. In essence, such repetitively firing neurons recognize subsequences, and their presence has previously been correlated with solving the transverse patterning problem. A more detailed analysis across training trials reveals how initial state randomization exerts its effect: its benefits derive from enhanced variation, across training trials, of the sequential states of the network. This greater variation is not uniformly present during training; it is largely restricted to the beginning of training and to trials on which novel sequences are introduced, with little such variation occurring after extensive or even moderate amounts of training. We explain why variation is high early in training but not later. This automatic modulation of initial-state-driven random variation is reminiscent of simulated annealing, in which modulated randomization encourages a selectively broad search through state space. In contrast to an annealing schedule, however, the selective occurrence of such a random search here is an emergent property, and the critical randomization occurs during training rather than during testing.
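As a concrete illustration of the mechanism described above, the sketch below is a minimal construction of our own, not the published model: a small, sparsely connected binary recurrent network whose initial state is randomized before each training trial on a fixed input sequence, with the trial-to-trial overlap of the resulting state trajectories printed as training proceeds. The network size, the winner-take-K stand-in for feedback inhibition, the Levy-style associative rule, and all parameter values are illustrative assumptions; the actual simulations encode transverse patterning (A+B-, B+C-, C+A-) as stimulus sequences, whereas an arbitrary fixed sequence suffices here.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400        # number of CA3-like neurons (assumed size)
P_CONN = 0.10  # probability of a recurrent connection (assumed)
EPS = 0.05     # associative learning rate (assumed)
K = 40         # neurons allowed to fire per step; a stand-in for inhibition
P_INIT = 0.05  # per-neuron firing probability in the randomized initial state

# Fixed sparse connectivity mask; weights start small and nonnegative.
C = (rng.random((N, N)) < P_CONN).astype(float)
np.fill_diagonal(C, 0.0)
W = 0.1 * C * rng.random((N, N))

def step(z):
    """Synchronous update: the K most excited neurons fire, a crude
    stand-in for the model's competitive feedback inhibition."""
    y = W @ z
    z_next = np.zeros(N)
    z_next[np.argsort(y)[-K:]] = 1.0
    return z_next

def learn(z_pre, z_post):
    """Levy-style associative rule: firing at time t moves each incoming
    weight toward the presynaptic activity at time t-1."""
    global W
    W += EPS * C * z_post[:, None] * (z_pre[None, :] - W)

def run_trial(input_seq):
    """One training trial. Only the initial state is randomized; the
    external input sequence (standing in for entorhinal cortical drive)
    is identical on every trial."""
    z = (rng.random(N) < P_INIT).astype(float)
    states = []
    for x in input_seq:
        z = np.maximum(z, x)   # clamp externally driven neurons on
        z_next = step(z)
        learn(z, z_next)
        states.append(z_next)
        z = z_next
    return states

# A fixed five-item input sequence, each item driving a random subset.
seq = []
for _ in range(5):
    x = np.zeros(N)
    x[rng.choice(N, size=K, replace=False)] = 1.0
    seq.append(x)

# Across-trial variation: Jaccard overlap between the active states of
# successive trials (learning also reshapes W, so overlap tends to rise).
prev = None
for trial in range(10):
    traj = np.concatenate(run_trial(seq))
    if prev is not None:
        overlap = np.sum(traj * prev) / np.sum(np.maximum(traj, prev))
        print(f"trial {trial}: overlap with previous trajectory = {overlap:.3f}")
    prev = traj
```

In this toy setting, trajectories typically differ most on the earliest trials and overlap increasingly as the weights organize, qualitatively echoing the finding that initial-state-driven variation is concentrated early in training.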
