Reservoir Computing Trends

Reservoir Computing (RC) is a paradigm for understanding and training Recurrent Neural Networks (RNNs), based on treating the recurrent part (the reservoir) differently from the readouts trained on it. It emerged around ten years ago and is currently a prolific research area, yielding important insights into RNNs, providing practical machine learning tools, and enabling computation with unconventional hardware. Here we give a brief introduction to the basic concepts, methods, insights, and current developments of RC, and highlight some of its applications.
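The core idea can be illustrated with a minimal Echo State Network, one of the main RC flavors: the recurrent weights are generated randomly and left untrained, and only a linear readout is fit, typically by ridge regression. The sketch below is illustrative only; the reservoir size, spectral radius, washout length, and regularization constant are arbitrary choices, not values from any particular paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Fixed random reservoir: these weights are never trained.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

# Drive the reservoir: x(t) = tanh(W_in * u(t) + W @ x(t-1)).
x = np.zeros(n_res)
states = []
for t in range(T):
    x = np.tanh(W_in * inputs[t] + W @ x)
    states.append(x.copy())
states = np.array(states)

# Discard an initial washout, then train only the linear readout
# by ridge regression on the collected reservoir states.
washout = 100
X, y = states[washout:], targets[washout:]
reg = 1e-8
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
print(f"readout MSE: {mse:.2e}")
```

Note that the expensive, unstable part of classical RNN training (adapting the recurrent weights by gradient descent through time) is skipped entirely; learning reduces to a single linear least-squares problem on the reservoir states.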
