Evolutionary strategy for simultaneous optimization of parameters, topology and reservoir weights in Echo State Networks

Reservoir Computing is a recent paradigm for training artificial recurrent neural networks: a reservoir is generated randomly and only a readout layer is trained [1]. Its simplicity and ease of use, paired with its underlying computational power, make it an attractive choice for many application domains, such as time-series prediction, speech recognition, noise modeling, dynamic pattern classification, reinforcement learning, and language modeling. However, creating a "good" reservoir for a given application still requires adjusting its parameters and topology. This paper presents an original investigation of an evolutionary method for the simultaneous optimization of parameters, topology, and reservoir weights in Echo State Networks. Optimizing reservoirs is challenging, and several evolutionary strategies for it have been proposed, generally separating the topology from the reservoir weights to reduce the search space [1]. Here we present a method that optimizes everything in concert. The results of applying this method to two different time series are shown and compared with previous works.
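The core mechanism described above — a fixed random reservoir whose readout alone is trained — can be sketched in a few lines. This is a minimal illustrative Echo State Network, not the paper's method: the hyperparameter values (`n_reservoir`, `spectral_radius`, `input_scaling`, `ridge`) are arbitrary assumptions standing in for exactly the quantities the paper's evolutionary strategy would optimize, and the readout is fit with ordinary ridge regression.

```python
import numpy as np

def esn_predict(inputs, targets, n_reservoir=100, spectral_radius=0.9,
                input_scaling=0.5, ridge=1e-6, seed=0):
    """Minimal Echo State Network: random reservoir, trained linear readout."""
    rng = np.random.default_rng(seed)
    # Random input and reservoir weights; rescale W to the chosen spectral
    # radius (the usual heuristic for the echo state property).
    W_in = rng.uniform(-input_scaling, input_scaling,
                       (n_reservoir, inputs.shape[1]))
    W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    # Drive the reservoir with the input sequence and collect its states.
    x = np.zeros(n_reservoir)
    states = np.empty((len(inputs), n_reservoir))
    for t, u in enumerate(inputs):
        x = np.tanh(W_in @ u + W @ x)
        states[t] = x
    # Only the readout is trained, here via ridge regression.
    W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_reservoir),
                            states.T @ targets)
    return states @ W_out

# One-step-ahead prediction of a sine wave; score after an initial washout.
u = np.sin(np.linspace(0, 20, 400)).reshape(-1, 1)
pred = esn_predict(u[:-1], u[1:])
mse = float(np.mean((pred[100:] - u[1:][100:]) ** 2))
```

In an evolutionary setting, a fitness function would wrap a call like `esn_predict` and return the validation error, with the genome encoding the scalar parameters, the connectivity pattern, and (in the simultaneous scheme investigated here) the reservoir weights themselves.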

[1] Radford M. Neal. Pattern Recognition and Machine Learning, 2007, Technometrics.

[2] T. van der Zant, et al. Identification of motion with echo state network, 2004, Oceans '04 MTS/IEEE Techno-Ocean '04 (IEEE Cat. No.04CH37600).

[3] Benjamin Schrauwen, et al. Event detection and localization for small mobile robots using reservoir computing, 2008, Neural Networks.

[4] Henry Markram, et al. Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations, 2002, Neural Computation.

[5] Benjamin Schrauwen, et al. An experimental unification of reservoir computing methods, 2007, Neural Networks.

[6] M. C. Ozturk, et al. Computing with transiently stable states, 2005, Proceedings of the 2005 IEEE International Joint Conference on Neural Networks.

[7] Herbert Jaeger, et al. The "echo state" approach to analysing and training recurrent neural networks, 2001.

[8] Benjamin Schrauwen, et al. On the Quantification of Dynamics in Reservoir Computing, 2009, ICANN.

[9] Benjamin Schrauwen, et al. The Introduction of Time-Scales in Reservoir Computing, Applied to Isolated Digits Recognition, 2007, ICANN.

[10] Lutz Prechelt, et al. A Set of Neural Network Benchmark Problems and Benchmarking Rules, 1994.

[11] John H. Holland, et al. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence, 1992.

[12] José Carlos Príncipe, et al. Analysis and Design of Echo State Networks, 2007, Neural Computation.

[13] Teresa Bernarda Ludermir, et al. Genetic algorithm for reservoir computing optimization, 2009, 2009 International Joint Conference on Neural Networks.

[14] Teresa Bernarda Ludermir, et al. Investigating the use of Reservoir Computing for forecasting the hourly wind speed in short-term, 2008, 2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence).

[15] Teresa Bernarda Ludermir, et al. Using Reservoir Computing for Forecasting Time Series: Brazilian Case Study, 2008, 2008 Eighth International Conference on Hybrid Intelligent Systems.

[16] Herbert Jaeger, et al. Reservoir computing approaches to recurrent neural network training, 2009, Comput. Sci. Rev.

[17] J. J. Steil. Backpropagation-decorrelation: online recurrent learning with O(N) complexity, 2004, 2004 IEEE International Joint Conference on Neural Networks (IEEE Cat. No.04CH37541).