Information processing using a single dynamical node as complex system

Novel methods for information processing are highly desired in our information-driven society. Inspired by the brain's ability to process information, the recently introduced paradigm known as 'reservoir computing' shows that complex networks can perform computation efficiently. Here we introduce a novel architecture that replaces the usually required large number of network elements with a single nonlinear node subject to delayed feedback. Through an electronic implementation, we experimentally and numerically demonstrate excellent performance on a speech recognition benchmark. Complementary numerical studies also show excellent performance on a time series prediction benchmark. These results demonstrate that delay-dynamical systems, even in their simplest manifestation, can perform efficient information processing. This finding paves the way to feasible and resource-efficient technological implementations of reservoir computing.
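The architecture described above is often simulated by time-multiplexing the delay line of the single node into N "virtual nodes": the input is spread over the delay period by a random mask, each virtual node is updated from its own value one delay period ago, and the node's finite response time couples successive virtual nodes. The following is a minimal, hypothetical sketch of that scheme; the saturating nonlinearity, parameter values (`eta`, `gamma`, `alpha`), mask, and toy memory task are illustrative assumptions, not the settings of the experiment reported here.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50        # virtual nodes along the delay line
alpha = 0.5   # coupling between successive virtual nodes (node inertia)
eta, gamma = 0.5, 0.5   # feedback strength and input scaling (illustrative)

def f(x):
    # Mackey-Glass-type saturating nonlinearity (illustrative choice)
    return x / (1.0 + abs(x))

# Toy task: recall the input from 3 steps earlier.
T = 2000
u = rng.uniform(-1.0, 1.0, T)
target = np.concatenate([np.zeros(3), u[:-3]])

mask = rng.uniform(-1.0, 1.0, N)   # random input mask over the virtual nodes

x_prev = np.zeros(N)               # virtual-node values one delay period ago
states = np.zeros((T, N))
for t in range(T):
    x = np.empty(N)
    carry = x_prev[-1]             # last virtual node of the previous period
    for i in range(N):
        # delayed feedback of this virtual node plus the masked input,
        # low-pass mixed with the preceding virtual node via alpha
        drive = f(eta * x_prev[i] + gamma * mask[i] * u[t])
        x[i] = alpha * carry + (1.0 - alpha) * drive
        carry = x[i]
    states[t] = x
    x_prev = x

# Linear readout trained by ridge regression, discarding a 100-step washout.
S, y = states[100:], target[100:]
w = np.linalg.solve(S.T @ S + 1e-6 * np.eye(N), S.T @ y)
nrmse = np.sqrt(np.mean((S @ w - y) ** 2) / np.var(y))
print(f"training NRMSE: {nrmse:.3f}")
```

As in reservoir computing generally, only the linear readout weights are trained; the single-node delay system itself is left fixed, which is what makes hardware implementations attractive.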
