The covariance perceptron: A new paradigm for classification and processing of time series in recurrent neuronal networks

Learning in neuronal networks has developed in many directions, in particular to reproduce cognitive tasks like image recognition and speech processing. Implementations have been inspired by stereotypical neuronal responses like tuning curves in the visual system, where, for example, ON/OFF cells fire or not depending on the contrast in their receptive fields. Classical models of neuronal networks therefore map a set of input signals to a set of activity levels in the output of the network. Each category of inputs is thereby predominantly characterized by its mean. In the case of time series, fluctuations around this mean constitute noise in this view. For this paradigm, the high variability exhibited by cortical activity may thus imply limitations or constraints, which have been discussed for many years: for example, the need to average neuronal activity over long periods or across large groups of cells to obtain a robust estimate of the mean and to diminish the effect of noise correlations. To reconcile robust computation with variable neuronal activity, we here propose a conceptual change of perspective: the variability of activity becomes the basis for stimulus-related information to be learned by neurons, rather than merely the noise that corrupts the mean signal.

In this new paradigm, both afferent and recurrent weights in a network are tuned to shape the input-output mapping for covariances, the second-order statistics of the fluctuating activity. When including time lags, covariance patterns define a natural metric for time series that captures their propagating nature. We develop the theory for classification of time series based on their spatio-temporal covariances, which reflect dynamical properties. We demonstrate that recurrent connectivity is able to transform information contained in the temporal structure of the signal into spatial covariances. Finally, we use the MNIST database to show how the covariance perceptron can capture specific second-order statistical patterns generated by moving digits.

Author summary

The dynamics in cortex is characterized by highly fluctuating activity: even under the very same experimental conditions, the activity typically does not reproduce on the level of individual spikes. Given this variability, how does the brain realize its quasi-deterministic function? One obvious solution is to compute averages over many cells, assuming that the mean activity, or rate, is the decisive signal; variability across trials of an experiment is then considered noise. We here explore the opposite view: can fluctuations be used to actually represent information? And if so, is there a benefit over a representation using the mean rate? We find that a fluctuation-based scheme is not only powerful in distinguishing signals into several classes, but also that networks can be trained efficiently in the new paradigm. Moreover, we argue why such a scheme of representation is more consistent with known forms of synaptic plasticity than rate-based network dynamics.
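To make the paradigm concrete, here is a minimal sketch of a feedforward covariance perceptron in its simplest linear setting: for y_t = W x_t, the output covariance is Q = W P W^T, and the afferent weights W are trained by gradient descent so that the input covariance P of each class is mapped to a class-specific target output covariance. This is our own toy construction under stated assumptions, not the paper's full method, which additionally tunes recurrent weights and uses time-lagged covariances; the dimensions, learning rate, and covariance patterns are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 10, 2                      # number of input / output channels

def random_cov(dim, rng):
    """Toy input covariance pattern (stand-in for covariances
    estimated from observed time series)."""
    a = rng.standard_normal((dim, dim))
    return a @ a.T / dim

# One input covariance pattern per class (hypothetical data).
P = [random_cov(m, rng) for _ in range(2)]

# Class-specific target output covariances: the two outputs fluctuate
# together for class 0 and in anti-phase for class 1.
Q_target = [np.array([[1.0, 0.8], [0.8, 1.0]]),
            np.array([[1.0, -0.8], [-0.8, 1.0]])]

# Train the afferent weights W so that W P W^T matches the target
# output covariance of each class.
W = 0.1 * rng.standard_normal((n, m))
eta = 0.02                        # learning rate
for step in range(5000):
    for Pk, Qk in zip(P, Q_target):
        Q = W @ Pk @ W.T                    # output covariance for y = W x
        W -= eta * 2.0 * (Q - Qk) @ W @ Pk  # gradient of ||Q - Qk||_F^2

for k, (Pk, Qk) in enumerate(zip(P, Q_target)):
    print(f"class {k}: residual {np.linalg.norm(W @ Pk @ W.T - Qk):.3f}")
```

Classification of a presented time series then amounts to comparing its empirical output covariance against the target patterns, for instance by Frobenius distance.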

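The claim that recurrent connectivity can transform temporal input structure into spatial output covariance can likewise be illustrated with a small simulation; again, this is our own construction with illustrative parameters, not the paper's setup. Two input classes share the same zero-lag (spatial) covariance and differ only in the time lag between their channels; a recurrent connection whose one-step delay matches the input lag converts that purely temporal difference into a difference in output variance.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 50_000

def inputs_with_delay(delay, rng, T):
    """Channel 1 is a copy of channel 0 delayed by `delay` steps;
    the zero-lag input covariance is ~identity for any delay >= 1."""
    z = rng.standard_normal(T + delay)
    return np.stack([z[delay:], z[:T]])

A = np.array([[0.0, 0.0],
              [0.6, 0.0]])        # recurrent weight: node 0 -> node 1

def output_cov(u):
    y = np.zeros_like(u)
    for t in range(u.shape[1] - 1):
        y[:, t + 1] = A @ y[:, t] + u[:, t]   # linear recurrent dynamics
    return np.cov(y)

for delay in (1, 2):              # two classes, same spatial input stats
    u = inputs_with_delay(delay, rng, T)
    print(f"delay {delay}: input cov ~\n{np.cov(u).round(2)}")
    print(f"         output cov ~\n{output_cov(u).round(2)}")
```

Only when the input lag matches the one-step recurrent delay do the direct input and the recurrently relayed copy arrive in phase at node 1, inflating its variance; the spatial covariance of the output thus encodes what was purely temporal structure in the input.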