Reservoir Computing: Information Processing of Stationary Signals

This paper extends the notion of information processing capacity to non-independent input signals in the context of reservoir computing (RC). The presence of input autocorrelation makes it worthwhile to treat forecasting and filtering problems, for which we explicitly compute this generalized capacity as a function of the reservoir parameter values using a streamlined model. The reservoir model leading to these developments is also used to show that, whenever the underlying approximation is valid, this computational paradigm satisfies the so-called separation and fading memory properties usually associated with good information processing performance. Finally, we show that several standard memory, forecasting, and filtering problems arising in the parametric stochastic time series context can be readily formulated and tackled via RC, which significantly outperforms standard techniques in some instances.
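
As a generic illustration of the forecasting task considered here (not the streamlined model analyzed in the paper), the following minimal sketch trains a standard echo state network with a ridge-regression readout to forecast a stationary AR(1) signal one step ahead. The reservoir size, spectral radius, input mask, and regularization constant are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stationary AR(1) input signal: z_t = phi * z_{t-1} + eps_t
T, phi = 2000, 0.8
eps = rng.normal(scale=0.1, size=T)
z = np.zeros(T)
for t in range(1, T):
    z[t] = phi * z[t - 1] + eps[t]

# Echo state network reservoir (illustrative parameters)
N = 100                                      # reservoir dimension
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # scale spectral radius below 1
w_in = rng.uniform(-0.5, 0.5, size=N)        # input mask

# Drive the reservoir with the input signal
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * z[t])

# Train a linear readout by ridge regression to forecast z_{t+1}
burn, lam = 100, 1e-6                        # washout length, ridge constant
X = x[burn:-1]
y = z[burn + 1:]
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

pred = X @ w_out
print("one-step forecast MSE:", np.mean((pred - y) ** 2))
print("last-value baseline MSE:", np.mean((z[burn:-1] - y) ** 2))
```

Scaling the reservoir matrix to a spectral radius below one is the standard heuristic for securing the echo state (fading memory) property mentioned above; the last-value predictor serves as a naive baseline against which the reservoir forecast can be compared.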
