Collective dynamics of rate neurons for supervised learning in a reservoir computing system.

In this paper, we study the collective dynamics of a network of rate neurons that constitutes the central element of a reservoir computing system. The main objective is to identify the dynamical behaviors inside the reservoir that underlie the performance of basic machine learning tasks, such as generating patterns with specified characteristics. We build a reservoir computing system consisting of a reservoir (a network of interacting rate neurons) and an output element that generates a target signal. We examine the individual activities of the interacting rate neurons while the task is being performed, and we analyze the impact of a dynamical parameter, the neuronal time constant, on the quality of the implementation.
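
To make the described architecture concrete, the following is a minimal sketch of such a system: leaky-integrator rate neurons with time constant tau driving a linear readout that is trained online, FORCE-style (recursive least squares), to generate a target pattern. All parameter values, the sinusoidal target, and the training scheme here are illustrative assumptions for exposition, not the specific setup used in the paper.

```python
import numpy as np

# --- Illustrative parameters (assumptions, not values from the paper) ---
N = 500        # number of rate neurons in the reservoir
g = 1.5        # recurrent gain (g > 1 puts the network in a rich, chaotic regime)
tau = 0.01     # neuronal time constant [s]; the dynamical parameter studied here
dt = 0.001     # Euler integration step [s]
alpha = 1.0    # RLS regularization constant
T_train = 5.0  # training duration [s]

rng = np.random.default_rng(0)
J = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random recurrent weights
w_fb = rng.uniform(-1.0, 1.0, N)                   # feedback from readout to reservoir
w_out = np.zeros(N)                                # readout weights (the only trained part)
P = np.eye(N) / alpha                              # RLS inverse correlation matrix

def target(t):
    """Target pattern the output element must generate (illustrative sinusoid)."""
    return np.sin(2.0 * np.pi * 1.0 * t)

x = 0.5 * rng.normal(size=N)  # reservoir state
for step in range(int(T_train / dt)):
    t = step * dt
    r = np.tanh(x)            # firing rates
    z = w_out @ r             # generated output signal
    # Leaky-integrator rate dynamics: tau * dx/dt = -x + J r + w_fb z
    x += (dt / tau) * (-x + J @ r + w_fb * z)
    # FORCE-style recursive least-squares update of the readout at every step
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w_out += (target(t) - z) * k
```

In this sketch, tau sets the intrinsic timescale of each neuron relative to the integration step and to the period of the target signal, which is why varying it (as the paper does) changes both the individual neuronal activities and the quality with which the readout reproduces the pattern.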
