A comparative study of Reservoir Computing strategies for monthly time series prediction

A good prediction of the future enables companies and governments to plan their investments, production and other needs. The demand for good forecasting techniques motivates researchers from a wide variety of fields to develop methods for time series prediction. Many of these techniques are complex to apply and computationally expensive to run. In response, we propose the use of Reservoir Computing, a recently developed technique for efficient training of recurrent neural networks, for monthly time series prediction. We explain how Reservoir Computing in its basic form can be applied to time series prediction. Additionally, we extend this approach with different Reservoir Computing strategies, such as seasonal adjustment and a Reservoir Computing-based voting ensemble. We investigate the performance of all the proposed strategies and compare their prediction accuracy with the linear forecasting procedure built into the Census Bureau's X-12-ARIMA program and with a Nonlinear Autoregressive model using Least-Squares Support Vector Machines.
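As a minimal illustration of the basic Reservoir Computing approach mentioned above, the sketch below drives a fixed random leaky-integrator reservoir with a monthly series and trains only a ridge-regression readout for one-step-ahead prediction. This follows the standard echo state network recipe; the reservoir size, leak rate, ridge parameter, and the synthetic series are illustrative assumptions and do not reproduce the paper's experiments.

```python
# Minimal echo state network (ESN) sketch for one-step-ahead monthly forecasting.
# Assumes the standard reservoir-computing recipe: fixed random reservoir,
# ridge-regression readout. All sizes and data here are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly series with trend, yearly seasonality, and noise.
t = np.arange(240)
series = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(t.size)

# Reservoir setup: random recurrent weights rescaled to a spectral radius
# below 1, a common heuristic for obtaining the echo state property.
n_res, spectral_radius, leak, ridge = 100, 0.9, 0.3, 1e-6
W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the leaky-integrator reservoir with the input sequence u."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for k, u_k in enumerate(u):
        pre = np.tanh(W_in * u_k + W @ x)
        x = (1 - leak) * x + leak * pre
        states[k] = x
    return states

# Inputs are the series itself; targets are the next month's value.
u, y = series[:-1], series[1:]
split, washout = 180, 24          # train/test split and transient to discard
X = run_reservoir(u)

# Ridge-regression readout trained on the first `split` months.
X_tr, y_tr = X[washout:split], y[washout:split]
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_res), X_tr.T @ y_tr)

# One-step-ahead predictions on the held-out months.
y_hat = X[split:] @ W_out
print("test RMSE:", np.sqrt(np.mean((y[split:] - y_hat) ** 2)))
```

The seasonal-adjustment and voting strategies discussed in the paper would wrap this basic predictor, e.g. by removing the seasonal component before training or by averaging the outputs of several independently initialized reservoirs.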
