Learning optimal spike-based representations

How can neural networks learn to represent information optimally? We answer this question by deriving spiking dynamics and learning dynamics directly from a measure of network performance. We find that a network of integrate-and-fire neurons undergoing Hebbian plasticity can learn an optimal spike-based representation for a linear decoder. The learning rule acts to minimise the magnitude of the membrane potential, which, after learning, can be interpreted as a representation error. In this way, learning reduces the representation error and drives the network into a robust, balanced regime. The network becomes balanced because small representation errors correspond to small membrane potentials, which in turn result from a balance of excitation and inhibition. The representation is robust because neurons become self-correcting, spiking only when the representation error exceeds a threshold. Altogether, these results suggest that several observed features of cortical dynamics, such as excitatory-inhibitory balance, integrate-and-fire dynamics and Hebbian plasticity, are signatures of a robust, optimal spike-based code.

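To make the scheme concrete, the sketch below simulates a small network in this spirit: a linear decoder x_hat = D r reads out filtered spike trains, each membrane potential tracks the decoder-projected representation error, a neuron fires when that error exceeds a threshold of half the squared norm of its decoder weight, and a simplified Hebbian-style update nudges the recurrent weights toward the balanced connectivity whenever a presynaptic spike arrives. All numerical values, the input signal, and the exact form of the plasticity rule are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- toy setup; every constant here is an illustrative assumption ---
N, K = 20, 2             # N spiking neurons encode a K-dimensional signal
dt, T_sim = 1e-3, 5.0    # time step and simulated duration (seconds)
lam = 10.0               # leak / decoder decay rate (1/s)
steps = int(T_sim / dt)

D = rng.normal(size=(K, N)) / np.sqrt(N)    # fixed linear decoder (columns D_i)
thresh = 0.5 * np.sum(D**2, axis=0)         # spike threshold ||D_i||^2 / 2

# Recurrent weights start random; the optimal connectivity would be -D.T @ D.
W = rng.normal(size=(N, N)) * 0.01
np.fill_diagonal(W, -np.sum(D**2, axis=0))  # keep the self-reset term fixed
eta = 0.02                                   # Hebbian learning rate (assumed)

# Target signal: two slow sinusoids.
t = np.arange(steps) * dt
x = np.stack([np.sin(2 * np.pi * 0.5 * t), np.cos(2 * np.pi * 0.3 * t)])

V = np.zeros(N)   # membrane potentials ~ decoder-projected representation error
r = np.zeros(N)   # filtered spike trains; the readout is x_hat = D @ r
errors = []

for k in range(steps):
    xk = x[:, k]
    dxdt = (x[:, min(k + 1, steps - 1)] - xk) / dt
    c = dxdt + lam * xk                      # feedforward drive

    # Leaky integration of the feedforward input; recurrence enters at spikes.
    V += dt * (-lam * V + D.T @ c)
    r += dt * (-lam * r)

    # Greedy spike rule: at most one spike per step, from the neuron whose
    # potential (error) exceeds its threshold by the largest margin.
    j = int(np.argmax(V - thresh))
    if V[j] > thresh[j]:
        V += W[:, j]         # recurrent kick, including the self-reset
        r[j] += 1.0

        # Simplified Hebbian-style plasticity (an assumption, not the paper's
        # exact rule): a presynaptic spike from j shifts its outgoing weights
        # against the residual postsynaptic potential, shrinking future
        # membrane potentials and pushing W toward -D.T @ D.
        mask = np.arange(N) != j
        W[mask, j] -= eta * V[mask]

    errors.append(np.linalg.norm(xk - D @ r))

print("mean decoding error, first second vs last second:",
      float(np.mean(errors[: int(1 / dt)])),
      float(np.mean(errors[-int(1 / dt):])))
```

With the balanced connectivity W = -D.T @ D, each spike's recurrent effect cancels its contribution to every other neuron's error, so the update above simply pushes the learned weights toward that cancellation point; how closely it converges within the few simulated seconds depends on the illustrative learning rate and signal.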