Liquid Time-constant Recurrent Neural Networks as Universal Approximators

In this paper, we introduce liquid time-constant (LTC) recurrent neural networks (RNNs), a subclass of continuous-time RNNs whose neuronal time-constants vary as a result of their nonlinear synaptic transmission model. This feature is inspired by communication principles in the nervous systems of small species, and it enables the model to approximate continuous mappings with a small number of computational units. We show that any finite trajectory of an $n$-dimensional continuous dynamical system can be approximated by the internal states of the hidden units and $n$ output units of an LTC network. We also derive theoretical bounds on the neuronal states and the varying time-constants.
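
As a minimal, illustrative sketch of the kind of dynamics described above (not the authors' exact formulation), the Python code below integrates a small layer of conductance-based units with explicit Euler steps. Because the synaptic conductance scales with the presynaptic activation, the effective time-constant of each unit depends on the network state rather than being fixed. All names and parameter values here (`ltc_step`, `g_leak`, `V_leak`, `C_m`, the sigmoid gain) are assumptions made for illustration only.

```python
import numpy as np

def sigmoid(x, mu=0.0, k=4.0):
    # Smooth synaptic activation; mu and k are illustrative parameters.
    return 1.0 / (1.0 + np.exp(-k * (x - mu)))

def ltc_step(v, I_in, dt, W, E, g_leak=1.0, V_leak=0.0, C_m=1.0):
    """One explicit-Euler step of an n-unit LTC-style layer (sketch).

    v    : (n,)   membrane states
    I_in : (n,)   external input currents
    W    : (n, n) synaptic conductance weights (W[i, j]: unit j -> unit i)
    E    : (n, n) synaptic reversal potentials
    """
    s = sigmoid(v)                       # presynaptic activations
    g_syn = W @ s                        # total synaptic conductance per unit
    # Conductance-based membrane equation:
    # C_m * dv_i/dt = g_leak*(V_leak - v_i) + sum_j W_ij * s_j * (E_ij - v_i) + I_i
    dv = (g_leak * (V_leak - v)
          + (W * (E - v[:, None])) @ s
          + I_in) / C_m
    # State-dependent ("liquid") time constant of each unit:
    tau = C_m / (g_leak + g_syn)
    return v + dt * dv, tau

# Toy usage: two units driven by a constant input current.
rng = np.random.default_rng(0)
n = 2
v = np.zeros(n)
W = rng.uniform(0.1, 1.0, size=(n, n))
E = rng.choice([-1.0, 1.0], size=(n, n))   # inhibitory / excitatory reversals
for _ in range(100):
    v, tau = ltc_step(v, I_in=np.ones(n), dt=0.01, W=W, E=E)
print(v, tau)                              # tau changes as the state evolves
```

In this sketch, since the activation satisfies 0 <= sigmoid(v) <= 1, the time-constant of each unit stays between C_m / (g_leak + sum_j W_ij) and C_m / g_leak; bounds of this general form on the states and time-constants are what the abstract alludes to.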
