Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning

We propose to use a biologically motivated learning rule based on neural intrinsic plasticity to optimize reservoirs of analog neurons. The rule relies on an information-maximization principle; it is local in time and space and therefore computationally efficient. We show experimentally that it can drive the neurons' output activities to approximate exponential distributions, thereby implementing sparse codes in the reservoir. Because of its incremental nature, intrinsic plasticity learning is well suited for joint application with online backpropagation-decorrelation or least-mean-squares reservoir learning, whose performance it can strongly improve. We further show that classical echo state regression also benefits from reservoirs that are pre-trained on the given input signal with the intrinsic plasticity rule.
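
To make the mechanism concrete, below is a minimal sketch of a Triesch-type intrinsic plasticity update for a single fermi (logistic) reservoir neuron, adapting its gain and bias online so that the output distribution approaches an exponential with a small target mean (a sparse code). The learning rate, target mean, and input statistics used here are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def ip_update(a, b, x, y, mu=0.2, eta=0.001):
    """One online intrinsic plasticity step (Triesch-style gradient rule).

    a, b : gain and bias of the fermi neuron y = 1 / (1 + exp(-(a*x + b)))
    x    : net synaptic input to the neuron
    y    : current output of the neuron
    mu   : target mean of the desired exponential output distribution
    eta  : learning rate (illustrative value)
    """
    # Bias update drives the output distribution towards Exp(mu).
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
    # Gain update shares the same error term, scaled by the input.
    da = eta / a + db * x
    return a + da, b + db

# Toy usage: drive one neuron with Gaussian input and adapt gain/bias online.
rng = np.random.default_rng(0)
a, b = 1.0, 0.0
for _ in range(50_000):
    x = rng.normal()
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))
    a, b = ip_update(a, b, x, y)
print(f"adapted gain a = {a:.3f}, bias b = {b:.3f}")
```

Because each update uses only the neuron's own input and output at the current time step, the rule is local in both time and space, which is what makes it cheap enough to run jointly with online reservoir learning.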
