Unifying quality metrics for reservoir networks

Several metrics for reservoir quality have been proposed and linked to reservoir performance in Echo State Networks and Liquid State Machines. Building on these existing metrics, a method for visualizing reservoir quality, called the separation ratio graph, is developed, leading to a generalized approach to measuring reservoir quality. The separation ratio provides a way to estimate the components of a reservoir's separation and to visualize the transition from stable to chaotic behavior. This new approach does not require prior classification of the input samples and can therefore be applied to reservoirs trained with unsupervised learning. It can also be used to analyze systems built from multiple reservoirs, measuring performance between any two points in the system.
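
To make the idea concrete, the following is a minimal sketch in Python of one plausible way to compute pairwise separation ratios for a simple echo state reservoir: for each pair of input sequences, the distance between the resulting reservoir states is divided by the distance between the inputs themselves, with no class labels required. The function names `reservoir_states` and `separation_ratios`, the tanh reservoir update, and the use of Euclidean distance are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def reservoir_states(W, W_in, inputs, leak=1.0):
    """Drive a generic leaky tanh echo state reservoir with an input
    sequence and return the final state vector. This is a standard ESN
    update, used here only as a stand-in reservoir model."""
    x = np.zeros(W.shape[0])
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ np.atleast_1d(u))
    return x

def separation_ratios(W, W_in, samples):
    """For every pair of input samples, compare the distance between their
    final reservoir states to the distance between the inputs themselves.
    This ratio (state separation / input separation) is one plausible,
    label-free reading of the separation ratio described in the abstract."""
    states = [reservoir_states(W, W_in, s) for s in samples]
    ratios = []
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            d_in = np.linalg.norm(np.asarray(samples[i]) - np.asarray(samples[j]))
            d_state = np.linalg.norm(states[i] - states[j])
            if d_in > 0:
                ratios.append(d_state / d_in)
    return np.array(ratios)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, n_in = 100, 1
    W = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale to spectral radius 0.9
    W_in = rng.uniform(-0.5, 0.5, size=(n, n_in))
    samples = [rng.uniform(-1, 1, size=(50, n_in)) for _ in range(10)]
    r = separation_ratios(W, W_in, samples)
    print(f"mean separation ratio: {r.mean():.3f}")
```

Sweeping the spectral radius of W and plotting how the distribution of ratios changes would give a rough, assumption-laden picture of the stable-to-chaotic transition that the separation ratio graph is intended to visualize.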
