An overview of reservoir computing: theory, applications and implementations

Training recurrent neural networks is hard. It has recently been discovered, however, that it is possible to simply construct a random recurrent topology and train only a single linear readout layer. State-of-the-art performance can be achieved with this setup, called Reservoir Computing. The idea can be broadened further: any high-dimensional, driven dynamical system, operated in the correct dynamic regime, can serve as a temporal 'kernel' that makes it possible to solve complex tasks using just linear post-processing techniques. This tutorial gives an overview of current research on the theory, applications and implementations of Reservoir Computing.
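To make the setup concrete, the following is a minimal sketch of an echo state network, the sigmoid-neuron instance of Reservoir Computing: a fixed random recurrent reservoir is driven by the input, and only a linear readout is trained by ridge regression. This is an illustration under assumed parameter values (reservoir size, spectral radius, regularization, and the toy delayed-sine task are all hypothetical), not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 200           # input and reservoir sizes (hypothetical values)
spectral_radius = 0.9          # scales the reservoir into a useful dynamic regime
ridge = 1e-6                   # regularization strength for the readout

# Fixed, untrained weights: input-to-reservoir and recurrent reservoir matrix.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)   # reservoir update; W and W_in stay fixed
        states.append(x.copy())
    return np.array(states)               # shape (T, n_res)

def train_readout(states, targets):
    """Only this linear readout is trained, via ridge regression."""
    X, Y = states, targets
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# Toy usage (hypothetical task): reproduce a delayed sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
y = np.sin(t - 0.5)[:, None]
states = run_reservoir(u)
W_out = train_readout(states[200:], y[200:])  # discard an initial washout period
y_hat = states @ W_out                        # linear post-processing of reservoir states
```

Because only the readout is learned, training reduces to a single linear regression over collected reservoir states, which is what makes the approach so much easier than full recurrent network training.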
