Takens-inspired neuromorphic processor: a downsizing tool for random recurrent neural networks via feature extraction

We describe a technique that minimizes the number of neurons in the hidden layer of a random recurrent neural network (rRNN) for time series prediction. By merging Takens-based attractor reconstruction methods with machine learning, we identify a feature-extraction mechanism that can be leveraged to reduce the network size. We obtain criteria specific to the prediction task at hand and derive the scaling law of the prediction error. We demonstrate the consequences of our theory by designing a Takens-inspired hybrid processor, which extends an rRNN with an a priori designed delay-based external memory; the hybrid architecture therefore comprises both real and virtual nodes. Exploiting this symbiosis, we show the performance of the hybrid processor by stabilizing an arrhythmic neural model. Thanks to the design rules obtained, the stabilizing neural network's size can be reduced by a factor of 15 with respect to a standard system.
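The core idea, Takens delay-coordinate features driving a small random recurrent network, can be sketched in a few lines. The code below is a minimal illustrative example only, not the authors' implementation: the function names, hyperparameters (embedding dimension, delay, reservoir size, spectral radius) and the ridge-regression readout are assumptions chosen for brevity.

```python
import numpy as np

def takens_embedding(x, dim, tau):
    """Delay-embed a scalar series x into vectors
    [x(t), x(t - tau), ..., x(t - (dim-1)*tau)] (Takens reconstruction)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[(dim - 1 - k) * tau:(dim - 1 - k) * tau + n]
                            for k in range(dim)])

def small_reservoir_forecast(x, dim=3, tau=10, n_nodes=30,
                             spectral_radius=0.9, ridge=1e-6, seed=0):
    """Sketch of a small random recurrent network (real nodes) driven by
    Takens delay coordinates (virtual, delay-based nodes), with a linear
    readout trained by ridge regression for one-step-ahead prediction."""
    rng = np.random.default_rng(seed)
    X = takens_embedding(x, dim, tau)        # delay-coordinate (virtual) features
    y = x[(dim - 1) * tau + 1:]              # one-step-ahead targets
    X = X[:-1]                               # align inputs with targets

    # Random input and recurrent weights, scaled to a fixed spectral radius.
    W_in = rng.uniform(-0.5, 0.5, size=(n_nodes, dim))
    W = rng.uniform(-0.5, 0.5, size=(n_nodes, n_nodes))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

    # Drive the small reservoir with the delay-embedded input.
    states = np.zeros((len(X), n_nodes))
    s = np.zeros(n_nodes)
    for t, u in enumerate(X):
        s = np.tanh(W @ s + W_in @ u)
        states[t] = s

    # Readout trained on both real (reservoir) and virtual (delay) nodes.
    features = np.hstack([states, X])
    gram = features.T @ features + ridge * np.eye(features.shape[1])
    W_out = np.linalg.solve(gram, features.T @ y)
    return features @ W_out, y               # predictions and targets

# Example on purely illustrative data (a noisy sine wave).
t = np.linspace(0, 100, 5000)
x = np.sin(t) + 0.01 * np.random.default_rng(1).normal(size=t.size)
pred, target = small_reservoir_forecast(x)
print("NRMSE:", np.sqrt(np.mean((pred - target) ** 2)) / np.std(target))
```

In this sketch the delay coordinates play the role of the a priori designed external memory (virtual nodes), so the random reservoir (real nodes) can be kept small; the 30-node value above is only a placeholder, not a reported result.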
