Critical echo state network dynamics by means of Fisher information maximization

The computational capability of an Echo State Network (ESN), expressed in terms of low prediction error and high short-term memory capacity, is maximized at the so-called "edge of criticality". In this paper we present a novel, unsupervised approach to identify this edge and, accordingly, determine the hyperparameter configuration that maximizes network performance. The proposed method is application-independent and stems from recent theoretical results consolidating the link between Fisher information and critical phase transitions. We show how to identify optimal ESN hyperparameters by relying solely on the Fisher information matrix (FIM) estimated from the activations of the hidden neurons. To account for the particular input signal driving the network dynamics, we adopt a recently proposed non-parametric FIM estimator. Experimental results on a set of standard benchmarks are provided and discussed, demonstrating the validity of the proposed method.
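The abstract describes a pipeline that can be summarized as: drive a fixed reservoir with the input signal of interest, estimate a Fisher-information criterion from the resulting hidden-state trajectory, and select the hyperparameters that maximize it. The minimal Python/NumPy sketch below illustrates only that search structure, under strong simplifying assumptions that are not from the paper: the non-parametric FIM estimator is replaced by a crude finite-difference, unit-variance Gaussian surrogate over two hyperparameters (spectral radius and input scaling), and the reservoir size, toy driving signal, and search grid are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100
W0 = rng.standard_normal((N, N))
W0 /= max(abs(np.linalg.eigvals(W0)))      # base reservoir matrix, unit spectral radius
w0 = rng.uniform(-1.0, 1.0, N)             # base input weights, unit scaling

def reservoir_states(u, rho, omega, washout=100):
    """Drive the fixed reservoir, rescaled by (rho, omega), and return its states."""
    x = np.zeros(N)
    X = []
    for u_t in u:
        x = np.tanh(rho * (W0 @ x) + omega * w0 * u_t)
        X.append(x.copy())
    return np.asarray(X[washout:])

def fim_trace(u, rho, omega, eps=1e-2):
    """Finite-difference trace of a unit-variance Gaussian FIM over (rho, omega);
    a crude stand-in for the non-parametric estimator used in the paper."""
    mu0 = reservoir_states(u, rho, omega).mean(axis=0)
    mu_rho = reservoir_states(u, rho + eps, omega).mean(axis=0)
    mu_omega = reservoir_states(u, rho, omega + eps).mean(axis=0)
    return (np.sum(((mu_rho - mu0) / eps) ** 2)
            + np.sum(((mu_omega - mu0) / eps) ** 2))

# Hypothetical driving signal and hyperparameter grid, purely for illustration.
u = np.sin(0.2 * np.arange(1500)) + 0.1 * rng.standard_normal(1500)
candidates = np.linspace(0.5, 1.3, 9)
scores = [fim_trace(u, rho, omega=0.5) for rho in candidates]
print("edge-of-criticality candidate rho:", candidates[int(np.argmax(scores))])
```

In the method proposed in the paper, the criterion would be computed with the non-parametric FIM estimator rather than this Gaussian surrogate; the sketch only conveys the unsupervised search: fix the input, sweep the hyperparameters, and keep the configuration with maximal Fisher information.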
