On the Correlation between Reservoir Metrics and Performance for Time Series Classification under the Influence of Synaptic Plasticity

Reservoir computing provides a simpler paradigm for training recurrent networks: the recurrent connections are initialised and adapted separately from a supervised linear readout. This separation comes at a cost, however. Because the recurrent weights and topology no longer adapt to the task, the burden falls on the reservoir designer to construct an effective network, one that happens to produce state vectors that can be mapped linearly onto the desired outputs. Guidance in forming a reservoir can come from established metrics that link theoretical properties of the reservoir computing paradigm to quantitative measures for evaluating the effectiveness of a given design. We provide a comprehensive empirical study of four such metrics: class separation, kernel quality, the Lyapunov exponent, and the spectral radius. Each metric is compared over a number of repeated runs, across reservoir computing set-ups that combine three network topologies with three mechanisms of weight adaptation through synaptic plasticity. Each combination of these methods is tested on two time-series classification problems. We find that the two metrics that correlate most strongly with classification performance are the Lyapunov exponent and kernel quality, and the comparisons indicate that these two metrics measure a similar property of the reservoir dynamics. Class separation and the spectral radius are both less reliable and less effective at predicting performance.
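As a concrete illustration of two of these metrics, the following is a minimal sketch in Python/NumPy, assuming a standard echo-state-style tanh reservoir rather than the exact networks studied here. The spectral radius is the largest absolute eigenvalue of the recurrent weight matrix, and kernel quality is estimated as the rank of a matrix of reservoir states collected from distinct input streams. All sizes, weight scalings, and the update rule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the networks studied in the paper may differ.
N = 100          # reservoir neurons
T = 200          # time steps per input stream
n_streams = 50   # number of distinct input streams

# Random recurrent weights, rescaled to a target spectral radius of 0.9.
W = rng.normal(0.0, 1.0, (N, N))
spectral_radius = np.max(np.abs(np.linalg.eigvals(W)))  # largest |eigenvalue|
W *= 0.9 / spectral_radius

W_in = rng.normal(0.0, 0.5, N)  # input weights (assumed scaling)

def final_state(u):
    """Run a simple tanh reservoir over the input sequence u; return the last state."""
    x = np.zeros(N)
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
    return x

# Kernel quality: rank of the matrix whose columns are the reservoir states
# reached from different input streams (a higher rank indicates a richer
# separation of inputs, up to min(N, n_streams)).
states = np.column_stack([final_state(rng.normal(size=T)) for _ in range(n_streams)])
kernel_quality = np.linalg.matrix_rank(states)

print(f"spectral radius after rescaling: {np.max(np.abs(np.linalg.eigvals(W))):.3f}")
print(f"kernel quality (rank of {N}x{n_streams} state matrix): {kernel_quality}")
```

In practice the rank computation uses a tolerance on the singular values, and the Lyapunov exponent is estimated separately, for example by tracking how quickly trajectories started from nearby reservoir states diverge; both refinements go beyond this sketch.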
