Memory traces in dynamical systems

To perform nontrivial, real-time computations on a sensory input stream, biological systems must retain a short-term memory trace of their recent inputs. It has been proposed that generic high-dimensional dynamical systems could retain a memory trace for past inputs in their current state. This raises important questions about the fundamental limits of such memory traces and the properties required of dynamical systems to achieve these limits. We address these issues by applying Fisher information theory to dynamical systems driven by time-dependent signals corrupted by noise. We introduce the Fisher Memory Curve (FMC) as a measure of the signal-to-noise ratio (SNR) embedded in the dynamical state relative to the input SNR. The integrated FMC gives the total memory capacity. We apply this theory to linear neuronal networks and show that the capacity of networks with normal connectivity matrices is exactly 1, and that the capacity of any network of N neurons is at most N. A nonnormal network achieving this bound is subject to stringent design constraints: it must have a hidden feedforward architecture that superlinearly amplifies its input for a time of order N, and the input connectivity must optimally match this architecture. The memory capacity of networks subject to saturating nonlinearities is further limited and cannot exceed √N. This limit can be realized by feedforward structures with divergent fan-out that distributes the signal across neurons, thereby avoiding saturation. We illustrate the generality of the theory by showing that memory in fluid systems can be sustained by transient nonnormal amplification due to convective instability or the onset of turbulence.
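A minimal numerical sketch of the two headline results for linear networks (illustrative code, not from the paper): it assumes the standard discrete-time linear network x(t) = W x(t-1) + v s(t) + z(t) with unit-variance white noise z and a unit-norm input vector v, and takes the FMC in the form J(k) = v^T (W^k)^T C_n^{-1} W^k v, where C_n is the stationary noise covariance solving C_n = W C_n W^T + I. Under these assumptions the integrated FMC of a normal connectivity matrix comes out to 1, while an amplifying feedforward chain, a simple nonnormal example, reaches a capacity of order N. All names and parameter choices below are illustrative.

import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def fisher_memory_curve(W, v, k_max=200):
    # FMC J(k) = v^T (W^k)^T C_n^{-1} W^k v, with C_n the stationary covariance
    # of the accumulated unit-variance noise: C_n = W C_n W^T + I.
    C_n = solve_discrete_lyapunov(W, np.eye(W.shape[0]))
    C_inv = np.linalg.inv(C_n)
    J, Wk_v = [], v.astype(float)
    for _ in range(k_max):
        J.append(Wk_v @ C_inv @ Wk_v)
        Wk_v = W @ Wk_v          # propagate the remembered input one more step
    return np.array(J)

N = 8
rng = np.random.default_rng(0)

# Normal (here symmetric) connectivity, scaled to spectral radius 0.9:
# the integrated FMC comes out to ~1, independent of the choice of v.
A = rng.standard_normal((N, N))
S = A + A.T
W_normal = 0.9 * S / np.abs(np.linalg.eigvalsh(S)).max()
v = np.ones(N) / np.sqrt(N)
print(fisher_memory_curve(W_normal, v).sum())      # ~1.0

# Nonnormal feedforward chain that amplifies the signal at each stage,
# driven at its first neuron: the integrated FMC is of order N
# (about 7.2 for N = 8 with this gain), approaching the N bound as the
# per-stage amplification increases.
W_chain = 3.0 * np.diag(np.ones(N - 1), k=-1)
e1 = np.zeros(N); e1[0] = 1.0
print(fisher_memory_curve(W_chain, e1).sum())      # of order N

Replacing the amplifying weights with a plain delay line (gain 1) illustrates the design constraint stated above: without superlinear amplification along the chain, the capacity of this nonnormal network only grows logarithmically with N rather than linearly.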
