Sensitive Finite-State Computations Using a Distributed Network With a Noisy Network Attractor

We exhibit a class of smooth continuous-state neural-inspired networks, composed of simple nonlinear elements, that can be made to function as a finite-state computational machine. We give an explicit construction of arbitrary finite-state virtual machines in the spatiotemporal dynamics of the network. The dynamics of the functional network can be completely characterized as a "noisy network attractor" in phase space, operating in either an "excitable" or a "free-running" regime, corresponding respectively to excitable or heteroclinic connections between states. The regime depends on the sign of an "excitability parameter." Viewing the network as a nonlinear stochastic differential equation in which a deterministic (signal) and/or a stochastic (noise) input can be applied to any element, we explore the influence of the signal-to-noise ratio on the error rate of the computations. The free-running regime is extremely sensitive to inputs: arbitrarily small-amplitude perturbations can be used to perform computations with the system, as long as the input dominates the noise. We find a counterintuitive regime in which increasing the noise amplitude can lead to more, rather than less, accurate computation. We suggest that noisy network attractors will be useful for understanding neural networks that reliably and sensitively perform finite-state computations in a noisy environment.
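The abstract's central point, that a small deterministic input can drive reliable state transitions provided it dominates the noise, can be illustrated with a minimal sketch. The toy model below is an assumption for illustration only, not the paper's actual network: a single bistable unit integrated by the Euler–Maruyama method, whose two wells stand in for two states of a noisy network attractor.

```python
import numpy as np

def simulate(signal=0.0, sigma=0.1, T=10.0, dt=0.01, seed=0):
    """Euler-Maruyama integration of a toy bistable unit
        dx = (x - x**3 + signal) dt + sigma dW,
    a crude stand-in for one connection of a noisy network attractor.
    `signal` is a deterministic input, `sigma` the noise amplitude."""
    rng = np.random.default_rng(seed)
    x = -1.0  # start in the left well ("state A")
    for _ in range(int(T / dt)):
        drift = x - x**3 + signal
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# With no input the unit stays near state A; an input that dominates
# the noise removes the left well and carries the unit to "state B".
resting = simulate(signal=0.0)   # remains negative (state A)
driven = simulate(signal=1.0)    # ends positive (state B)
```

With `sigma` small relative to the barrier, spontaneous (noise-induced) transitions are exponentially rare, so the transition is effectively input-gated; this is the excitable-regime picture, whereas the free-running regime would correspond to connections that are crossed even without input.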
