A stochastic population approach to the problem of stable recruitment hierarchies in spiking neural networks

Synchrony-driven recruitment learning addresses the question of how arbitrary concepts, represented by synchronously active ensembles, may be acquired within a randomly connected static graph of neuron-like elements. Recruitment learning in hierarchies is an inherently unstable process. This paper derives conditions on network parameters under which a feedforward network maintains stable recruitment hierarchies. The analysis uses a stochastic population approach to model a spiking neural network, and the resulting network converges to activate a desired number of units at each stage of the hierarchy. The original recruitment method is modified in three steps: feedforward connection density is increased to ensure sufficient activation, temporally distributed feedforward delays are incorporated to separate inputs in time, and excess activation is limited via lateral inhibition. The task of activating a desired number of units from a population is thus performed similarly to a temporal k-winners-take-all network.
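The three-step scheme described above, random static feedforward connectivity dense enough for sufficient activation, with lateral inhibition capping the recruited set at k winners, can be illustrated with a minimal sketch of a single recruitment stage. All function names, parameter values, and the tie-breaking rule here are illustrative assumptions, not taken from the paper; in particular the temporal k-winners-take-all dynamics are approximated by simply keeping the k most strongly driven candidates, standing in for the units that would fire earliest and inhibit the rest.

```python
import random

def recruit_layer(n_units, n_inputs, p_conn, threshold, k, seed=0):
    """One recruitment stage (illustrative sketch, not the paper's model):
    random static feedforward connectivity, synchronous input summation,
    and a k-winners cap standing in for lateral inhibition."""
    rng = random.Random(seed)
    # Each target unit connects to each synchronously firing input with
    # probability p_conn; its drive is its number of incoming connections,
    # so drive is Binomial(n_inputs, p_conn) across the population.
    drive = {u: sum(1 for _ in range(n_inputs) if rng.random() < p_conn)
             for u in range(n_units)}
    # Units whose summed synchronous drive reaches threshold are candidates
    # for recruitment into the next stage of the hierarchy.
    candidates = [u for u in range(n_units) if drive[u] >= threshold]
    # Temporal k-WTA analogue: the k most strongly driven candidates "fire
    # first" and suppress the remainder via lateral inhibition.
    return sorted(candidates, key=lambda u: -drive[u])[:k]
```

With a dense enough p_conn relative to the threshold, nearly all units become candidates and the inhibition cap alone sets the recruited set size; with too sparse a connectivity, fewer than k candidates survive and the hierarchy's activation decays stage by stage, which is the instability the paper's parameter conditions are designed to avoid.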
