Time cells might be optimized for predictive capacity, not redundancy reduction or memory capacity.

Researchers have recently found time cells in the hippocampus that appear to carry information about the timing of past events. Some researchers have argued that time cells compute a Laplace transform of their input in order to reconstruct the past stimulus. We argue that stimulus prediction, not stimulus reconstruction or redundancy reduction, better accounts for the observed responses of time cells. In the process, we introduce new analyses of nonlinear, continuous-time reservoirs that model these time cells.
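To make the memory-versus-prediction contrast concrete, here is a minimal sketch (not the authors' model) of a nonlinear, continuous-time reservoir of the kind alluded to above: a leaky tanh network driven by a correlated stimulus, with linear readouts trained either to reconstruct the stimulus at a past lag (memory capacity) or to predict it at a future lag (predictive capacity). All parameter values (network size, leak rate, spectral radius, lags) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a leaky, nonlinear, continuous-time reservoir with
# linear readouts for past-stimulus reconstruction vs. future-stimulus
# prediction. Parameters are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Stimulus: an Ornstein-Uhlenbeck process, so it is partially predictable.
dt, T, tau_s = 0.01, 200.0, 1.0
steps = int(T / dt)
x = np.zeros(steps)
for t in range(1, steps):
    x[t] = x[t-1] - dt * x[t-1] / tau_s + np.sqrt(2 * dt / tau_s) * rng.standard_normal()

# Leaky tanh reservoir, Euler-discretized:
#   dr/dt = (-r + tanh(W r + w_in x)) / tau_r
N, tau_r = 200, 0.5
W = rng.standard_normal((N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius below 1
w_in = rng.standard_normal(N)
r = np.zeros((steps, N))
for t in range(1, steps):
    drive = np.tanh(W @ r[t-1] + w_in * x[t-1])
    r[t] = r[t-1] + dt * (-r[t-1] + drive) / tau_r

def readout_r2(lag_steps, future):
    """R^2 of a ridge-regression readout mapping r(t) onto x(t +/- lag)."""
    if future:
        R, y = r[:steps - lag_steps], x[lag_steps:]   # predict x(t + lag)
    else:
        R, y = r[lag_steps:], x[:steps - lag_steps]   # reconstruct x(t - lag)
    w = np.linalg.solve(R.T @ R + 1e-3 * np.eye(N), R.T @ y)
    resid = y - R @ w
    return 1.0 - resid.var() / y.var()

lag = int(0.5 / dt)   # half a stimulus correlation time
print("memory     R^2, reconstruct x(t - lag):", round(readout_r2(lag, future=False), 3))
print("prediction R^2, predict     x(t + lag):", round(readout_r2(lag, future=True), 3))
```

Under these assumptions the two readouts share the same reservoir state, so the comparison isolates how much of the state is useful for remembering the past versus anticipating the future; the paper's argument is that time-cell responses look more like the latter objective.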
