Biologically-Based Computation: How Neural Details and Dynamics Are Suited for Implementing a Variety of Algorithms

The Neural Engineering Framework (NEF; Eliasmith & Anderson, 2003) is a long-standing method for implementing high-level algorithms constrained by low-level neurobiological details. In recent years, the method has been extended to incorporate more biological detail and has been applied to new tasks. This paper brings these ongoing research strands together and presents them in a common framework. We expand on the NEF’s core principles of (a) specifying the desired tuning curves of neurons in different parts of the model, (b) defining the computational relationships between the values represented by the neurons in those parts, and (c) finding the synaptic connection weights that will give rise to those computations and tuning curves. In particular, we show how to extend this approach to include complex spatiotemporal tuning curves, and we then apply it to produce functional computational models of grid cells, time cells, path integration, sparse representations, probabilistic representations, and symbolic representations in the brain.
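The three principles can be made concrete with a small numerical sketch. The following is a minimal illustration in numpy, not code from the paper: the rectified-linear rate neurons, the random choice of encoders, gains, and biases, and the example target function x² are all illustrative assumptions (the NEF typically uses spiking LIF neurons and synaptic filtering, omitted here).

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 50, 50

# Principle (a): tuning curves. Each neuron gets a random encoder
# (preferred direction), gain, and bias; rates are rectified-linear.
def make_population(n):
    enc = rng.choice([-1.0, 1.0], size=n)
    gain = rng.uniform(0.5, 2.0, size=n)
    bias = rng.uniform(-1.0, 1.0, size=n)
    return enc, gain, bias

def rates(x, enc, gain, bias):
    # x: (samples,) -> firing rates: (samples, n)
    return np.maximum(0.0, gain * np.outer(x, enc) + bias)

enc1, g1, b1 = make_population(n_pre)

# Principle (b): the function the connection should compute on the
# represented value; here f(x) = x**2 over the range [-1, 1].
xs = np.linspace(-1, 1, 200)
A = rates(xs, enc1, g1, b1)
target = xs ** 2

# Principle (c): solve regularized least squares for decoders D, then
# build the full weight matrix as (post encoders) outer (pre decoders).
reg = 0.1 * A.max()
D = np.linalg.solve(A.T @ A + reg**2 * np.eye(n_pre), A.T @ target)

enc2, g2, b2 = make_population(n_post)
W = np.outer(enc2, D)          # (n_post, n_pre) synaptic weights

xhat = A @ D                   # decoded estimate of f(x)
print(np.max(np.abs(xhat - target)))
```

Because the weights factor into encoders and decoders, the decoders are solved for once offline and the network itself only ever stores and applies W.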

[1] C. Eliasmith, et al. Constructing functional models from biophysically-detailed neurons, 2022, PLoS Computational Biology.

[2] C. Eliasmith, et al. Computational properties of multi-compartment LIF neurons with passive dendrites, 2022, Neuromorph. Comput. Eng.

[3] Sergio Gomez Colmenarejo, et al. A Generalist Agent, 2022, Trans. Mach. Learn. Res.

[4] B. Olshausen, et al. Computing on Functions Using Randomized Vector Representations (in brief), 2022, NICE.

[5] Andreas Stöckel. Harnessing Neural Dynamics as a Computational Resource, 2022.

[6] P. M. Furlong, et al. Learned Legendre Predictor: Learning with Compressed Representations for Efficient Online Multistep Prediction, 2022.

[7] P. M. Furlong. Fractional Binding in Vector Symbolic Architectures as Quasi-Probability Statements, 2022.

[8] P. M. Furlong, et al. Fractional Binding in Vector Symbolic Representations for Efficient Mutual Information Exploration, 2022.

[9] Chris Eliasmith, et al. Language Modeling using LMUs: 10x Better Data Efficiency or Improved Scaling Compared to Transformers, 2021, ArXiv.

[11] Peter Blouw, et al. Simulating and Predicting Dynamical Systems With Spatial Semantic Pointers, 2021, Neural Computation.

[12] Terrence C. Stewart, et al. Connecting Biological Detail With Neural Computation: Application to the Cerebellar Granule-Golgi Microcircuit, 2021, Top. Cogn. Sci.

[13] Huajin Tang, et al. Why grid cells function as a metric for space, 2021, Neural Networks.

[14] Chris Eliasmith, et al. Parallelizing Legendre Memory Unit Training, 2021, ICML.

[15] E. De Schutter, et al. The Cellular Electrophysiological Properties Underlying Multiplexed Coding in Purkinje Cells, 2021, The Journal of Neuroscience.

[16] Chris Eliasmith, et al. Passive Nonlinear Dendritic Interactions as a Computational Resource in Spiking Neural Networks, 2020, Neural Computation.

[17] Jonathan F. Kominsky, et al. Causality and continuity close the gaps in event representations, 2020, Memory & Cognition.

[18] Brent Komer, et al. Biologically Inspired Spatial Representation, 2020.

[19] C. Eliasmith, et al. A Biologically Plausible Spiking Neural Model of Eyeblink Conditioning in the Cerebellum, 2020, CogSci.

[20] C. Eliasmith, et al. Accurate representation for spatial cognition using grid cells, 2020, CogSci.

[21] E. De Schutter, et al. Firing rate-dependent phase responses of Purkinje cells support transient oscillations, 2019, bioRxiv.

[22] Chris Eliasmith, et al. Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks, 2019, NeurIPS.

[23] Terrence C. Stewart, et al. Flexible timing with delay networks – The scalar property and neural scaling, 2019.

[24] Panayiota Poirazi, et al. Challenging the point neuron dogma: FS basket cells as 2-stage nonlinear integrators, 2018, Nature Communications.

[25] Terrence C. Stewart, et al. A neural representation of continuous space using fractional binding, 2019, CogSci.

[26] Feng-Xuan Choo, et al. Spaun 2.0: Extending the World’s Largest Functional Brain Model, 2018.

[27] Øyvind Arne Høydal, et al. Object-vector coding in the medial entorhinal cortex, 2018, bioRxiv.

[28] Chris Eliasmith, et al. Improving Spiking Dynamical Networks: Accurate Delays, Higher-Order Synapses, and Time Cells, 2018, Neural Computation.

[29] Nicholas A. Lusk, et al. Cerebellar, hippocampal, and striatal time cells, 2016, Current Opinion in Behavioral Sciences.

[30] Demis Hassabis, et al. Mastering the game of Go with deep neural networks and tree search, 2016, Nature.

[31] A. Hall, et al. Adaptive Switching Circuits, 2016.

[32] Jeff Orchard, et al. Oscillator-Interference Models of Path Integration Do Not Require Theta Oscillations, 2015, Neural Computation.

[33] Qian Du, et al. A Unified Mathematical Framework for Coding Time, Space, and Sequences in the Hippocampal Region, 2014, The Journal of Neuroscience.

[34] Trevor Bekolay, et al. Nengo: a Python tool for building large-scale functional brain models, 2014, Front. Neuroinform.

[35] Jason S. Rothman, et al. Modeling Synapses, 2014, Encyclopedia of Computational Neuroscience.

[36] Chris Eliasmith. How to Build a Brain: A Neural Architecture for Biological Cognition, 2013.

[37] Trevor Bekolay, et al. A Large-Scale Model of the Functioning Brain, 2012, Science.

[38] S. Furber, et al. To build a brain, 2012, IEEE Spectrum.

[39] Chris Eliasmith, et al. Fine-Tuning and the Stability of Recurrent Neural Networks, 2011, PLoS ONE.

[40] H. Eichenbaum, et al. Hippocampal “Time Cells” Bridge the Gap in Memory for Discontiguous Events, 2011, Neuron.

[41] Pierre Pica, et al. Flexible intuitions of Euclidean geometry in an Amazonian indigene group, 2011, Proceedings of the National Academy of Sciences.

[42] Alessandro Treves, et al. How Informative Are Spatial CA3 Representations Established by the Dentate Gyrus?, 2009, PLoS Comput. Biol.

[43] Bart Farell, et al. Is perceptual space inherently non-Euclidean?, 2009, Journal of Mathematical Psychology.

[44] Bryan P. Tripp. A Search For Principles of Basal Ganglia Function, 2009.

[45] M. Moser, et al. Representation of Geometric Borders in the Entorhinal Cortex, 2008, Science.

[46] M. Fyhn, et al. Progressive increase in grid scale from dorsal to ventral medial entorhinal cortex, 2008, Hippocampus.

[47] Asohan Amarasingham, et al. Internally Generated Cell Assembly Sequences in the Rat Hippocampus, 2008, Science.

[48] B. Schölkopf, et al. Kernel methods in machine learning, 2007, math/0701907.

[49] Benjamin Recht, et al. Random Features for Large-Scale Kernel Machines, 2007, NIPS.

[50] Torkel Hafting, et al. Conjunctive Representation of Position, Direction, and Velocity in Entorhinal Cortex, 2006, Science.

[51] T. Hafting, et al. Microstructure of a spatial map in the entorhinal cortex, 2005, Nature.

[52] Philipp Slusallek, et al. Introduction to real-time ray tracing, 2005, SIGGRAPH Courses.

[53] Ross W. Gayler. Vector Symbolic Architectures answer Jackendoff's challenges for cognitive neuroscience, 2004, ArXiv.

[54] Chris Eliasmith, et al. Neural Engineering: Computation, Representation, and Dynamics in Neurobiological Systems, 2004, IEEE Transactions on Neural Networks.

[55] Ingrid K. Glad, et al. Correction of Density Estimators that are not Densities, 2003.

[56] Javier F. Medina, et al. Computer simulation of cerebellar information processing, 2000, Nature Neuroscience.

[57] J. Anthony Movshon, et al. Linearity and gain control in V1 simple cells, 1999.

[58] John R. Anderson, et al. ACT-R: A Theory of Higher Level Cognition and Its Relation to Visual Attention, 1997, Hum. Comput. Interact.

[59] B. McNaughton, et al. Spatial information content and reliability of hippocampal CA1 neurons: Effects of visual input, 1994, Hippocampus.

[60] Geoffrey E. Hinton, et al. Distributed representations and nested compositional structure, 1994.

[61] Allen Newell, et al. SOAR: An Architecture for General Intelligence, 1987, Artif. Intell.

[62] D. Field, et al. The structure and symmetry of simple-cell receptive-field profiles in the cat’s visual cortex, 1986, Proceedings of the Royal Society of London. Series B, Biological Sciences.

[63] James L. McClelland, et al. Parallel distributed processing: explorations in the microstructure of cognition, vol. 1: foundations, 1986.

[64] E. H. Adelson, et al. Spatiotemporal energy models for the perception of motion, 1985, Journal of the Optical Society of America A, Optics and Image Science.

[65] Andrew B. Watson, et al. A look at motion in the frequency domain, 1983.

[66] P. Thorndyke. Distance estimation from cognitive maps, 1981, Cognitive Psychology.

[67] S. Marcelja. Mathematical description of the responses of simple cortical cells, 1980, Journal of the Optical Society of America.

[68] J. O'Keefe, et al. The hippocampus as a spatial map: preliminary evidence from unit activity in the freely-moving rat, 1971, Brain Research.

[69] D. Hubel, et al. Receptive fields of single neurones in the cat's striate cortex, 1959, The Journal of Physiology.

[70] E. Tolman. Cognitive maps in rats and men, 1948, Psychological Review.