Long Short-Term Memory in Echo State Networks: Details of a Simulation Study

Echo State Networks (ESNs) are an approach to designing and training recurrent neural networks for supervised learning tasks. An important objective in many such tasks is to learn to exploit long-range dependencies in the processed signals ("long short-term memory" performance). Here we expose ESNs to a series of synthetic benchmark tasks that have been used in the literature to study the learnability of long-range temporal dependencies. This report provides all the detail necessary to replicate these experiments. It is intended as the technical companion to a journal submission in which the findings are analysed and compared with results obtained elsewhere using other learning paradigms.
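
As a concrete illustration of the ESN approach referred to above, the following is a minimal sketch in Python/NumPy: the input and reservoir weights are generated randomly and left fixed (with the reservoir scaled to a spectral radius below one), and only the linear readout is trained, here by ridge regression. The dimensions, the spectral radius of 0.9, the regularisation constant, and the delay-recall toy task are illustrative assumptions for this sketch, not the settings used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the study's settings)
n_in, n_res, n_out = 1, 100, 1
washout, T = 100, 1000

# Fixed random input and reservoir weights; the reservoir is rescaled
# so its spectral radius is below 1 (a standard echo state heuristic).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states x(t+1) = tanh(W x(t) + W_in u(t+1))."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W @ x + W_in @ u[t])
        states.append(x.copy())
    return np.array(states)

# Toy memory task: output the input signal delayed by 5 steps.
u = rng.uniform(-1, 1, (T, n_in))
y = np.roll(u, 5, axis=0)

X = run_reservoir(u)[washout:]   # discard the initial transient
Y = y[washout:]

# Ridge regression for the readout -- the only trained weights.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ Y).T

y_hat = X @ W_out.T
print("train NRMSE:", np.sqrt(np.mean((y_hat - Y) ** 2)) / np.std(Y))
```

Training only the readout by a linear regression is what makes ESN training cheap compared with gradient-based methods such as backpropagation through time; the washout period discards initial transients so that the regression only sees states that no longer depend on the arbitrary initial reservoir state.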
