On the accuracy and computational cost of spiking neuron implementation.

For more than a decade, three statements about spiking neuron (SN) implementations have been widely accepted: 1) the Hodgkin-Huxley (HH) model is computationally prohibitive, 2) the Izhikevich (IZH) artificial neuron is as efficient as the Leaky Integrate-and-Fire (LIF) model, and 3) the IZH model is more efficient than the HH model (Izhikevich, 2004). As Hodgkin and Huxley (1952) themselves suggested, their model can be run in two modes: by evaluating the α and β rate functions directly (HH model) or by storing them in lookup tables (HHT model) to reduce the computational cost. More recently, it has been claimed that 1) the HHT model (HH using tables) is not prohibitive, 2) the IZH model is not efficient, and 3) the HHT and IZH models are comparable in computational cost (Skocik & Long, 2014). This controversy shows that there is no consensus on SN simulation capabilities. Hence, in this work we introduce a refined approach, based on multiobjective optimization theory, for describing SN simulation capabilities and, ultimately, for choosing optimal simulation parameters. We used normalized metrics to define the capability levels of accuracy, computational cost, and efficiency; normalization allows SNs to be compared on the same scale. We conducted tests for balanced, lower-boundary, and upper-boundary conditions under the regular spiking mode with constant and random current stimuli, and we found optimal simulation parameters that balance computational cost and accuracy. Importantly, we found in general that 1) the HH model (without tables) is the most accurate, the least computationally expensive, and the most efficient; 2) the IZH model is the most expensive and the least efficient; 3) the LIF and HHT models are the least accurate; 4) the HHT model is more expensive and less accurate than the HH model because of the discretization of the α and β tables; and 5) the HHT model is not comparable in computational cost to the IZH model. These results refute the claims formulated over a decade ago (Izhikevich, 2004) and deepen the statements made by Skocik and Long (2014). Our findings imply that the number of dimensions or FLOPS of an SN is a theoretical, but not a practical, indicator of its true computational cost. The metric we propose for computational cost is more precise than FLOPS and was found to be invariant to computer architecture. Moreover, we found that the firing frequency used in previous works is a necessary but insufficient metric for evaluating simulation accuracy. We also show that our results are consistent with the theory of numerical methods and with the theory of SN discontinuity: discontinuous SNs, such as the LIF and IZH models, introduce a considerable error every time a spike is generated. In addition, a random input current increases the computational cost and the error compared with a constant input current. Furthermore, we found that the search for optimal simulation parameters is problem-specific. This matters because most previous works have sought a single, general optimal simulation; we show that such a solution cannot exist, because this is a multiobjective optimization problem that depends on several factors. This work establishes a renewed thesis on SN simulation that is useful to several related research areas, including the emerging Deep Spiking Neural Networks.
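
The difference between the HH and HHT modes can be made concrete with a short sketch. The Python snippet below is not taken from the paper; the voltage range, the table step dV, and the nearest-neighbour lookup are illustrative assumptions. It evaluates one of the classic HH rate functions both directly and through a precomputed table, and measures the discretization error that the abstract attributes to the HHT mode.

# Minimal sketch (not the authors' code): direct evaluation of a Hodgkin-Huxley
# rate function (HH mode) versus a precomputed lookup table (HHT mode).
# The voltage range, the table step dV, and the nearest-neighbour lookup are
# assumptions made here for illustration only.
import numpy as np

def alpha_n(v):
    """alpha_n(V) = 0.01 (10 - V) / (exp((10 - V)/10) - 1), in the original HH
    convention (V in mV). The removable singularity at V = 10 mV is replaced by
    its limiting value, 0.1."""
    v = np.asarray(v, dtype=float)
    x = 10.0 - v
    out = np.full_like(x, 0.1)
    mask = ~np.isclose(x, 0.0)
    out[mask] = 0.01 * x[mask] / np.expm1(x[mask] / 10.0)
    return out

# HHT mode: tabulate the rate on a uniform voltage grid (assumed range and step).
dV = 0.5                                   # table step in mV (assumption)
v_grid = np.arange(-50.0, 150.0 + dV, dV)  # assumed voltage range in mV
alpha_n_tab = alpha_n(v_grid)

def alpha_n_table(v):
    """Nearest-neighbour table lookup; rounding V to the grid is the source of
    the discretization error attributed to the HHT model."""
    idx = np.round((np.asarray(v, dtype=float) - v_grid[0]) / dV).astype(int)
    idx = np.clip(idx, 0, len(v_grid) - 1)
    return alpha_n_tab[idx]

# Compare the two modes at off-grid voltages.
v_test = np.linspace(-40.0, 120.0, 1001)
err = np.abs(alpha_n(v_test) - alpha_n_table(v_test))
print(f"max |direct - table| for alpha_n: {err.max():.3e} (dV = {dV} mV)")

Shrinking dV reduces this lookup error at the price of a larger table and more memory traffic, which is consistent with the accuracy-versus-cost trade-off the abstract describes for the HHT model.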

[1] Markus Diesmann, et al. Exact Subthreshold Integration with Continuous Spike Times in Discrete-Time Neural Network Simulations, 2007, Neural Computation.

[2] J. W. Moore, et al. On numerical integration of the Hodgkin and Huxley equations for a membrane action potential, 1974, Journal of Theoretical Biology.

[3] Hirotaka Nakayama, et al. Theory of Multiobjective Optimization, 1985.

[4] Steve B. Furber, et al. Accuracy and Efficiency in Fixed-Point Neural ODE Solvers, 2015, Neural Computation.

[5] Steven C. Chapra, et al. Numerical Methods for Engineers, 1986.

[6] Carver Mead, et al. Analog VLSI and neural systems, 1989.

[7] Jonathan Touboul, et al. On the Simulation of Nonlinear Bidimensional Spiking Neuron Models, 2010, Neural Computation.

[8] J. Jack, et al. Electric current flow in excitable cells, 1975.

[9] C. Morris, et al. Voltage oscillations in the barnacle giant muscle fiber, 1981, Biophysical Journal.

[10] Gert Cauwenberghs, et al. Large-Scale Neuromorphic Spiking Array Processors: A Quest to Mimic the Brain, 2018, Front. Neurosci.

[11] Alois Knoll, et al. Neuromorphic implementations of neurobiological learning algorithms for spiking neural networks, 2015, Neural Networks.

[12] Nabil H. Farhat, et al. The double queue method: a numerical method for integrate-and-fire neuron networks, 2001, Neural Networks.

[13] Stefan Rotter, et al. Exact digital simulation of time-invariant linear systems with applications to neuronal modeling, 1999, Biological Cybernetics.

[14] Stephan Henker, et al. Accuracy evaluation of numerical methods used in state-of-the-art simulators for spiking neural networks, 2011, Journal of Computational Neuroscience.

[15] Lyle N. Long, et al. On the Capabilities and Computational Costs of Neuron Models, 2014, IEEE Transactions on Neural Networks and Learning Systems.

[16] C. Willmott, et al. Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance, 2005.

[17] Szabolcs Káli, et al. A flexible, interactive software tool for fitting the parameters of neuronal models, 2014, Front. Neuroinform.

[18] Ankur Gupta, et al. Biologically-inspired spiking neural networks with Hebbian learning for vision processing, 2008.

[19] Johannes Schemmel, et al. Spiking neurons with short-term synaptic plasticity form superior generative networks, 2018, Scientific Reports.

[20] Germán Mato, et al. On Numerical Simulations of Integrate-and-Fire Neural Networks, 1998, Neural Computation.

[21] Michael Pfeiffer, et al. Deep Learning With Spiking Neurons: Opportunities and Challenges, 2018, Front. Neurosci.

[22] C. Kambhampati, et al. Spiking Neurons: Is coincidence-factor enough for comparing responses with fluctuating membrane voltage?, 2008.

[23] Wulfram Gerstner, et al. A History of Spike-Timing-Dependent Plasticity, 2011, Front. Syn. Neurosci.

[24] Ernst Hairer, et al. Solving Ordinary Differential Equations I: Nonstiff Problems, 2009.

[25] Y. Dan, et al. Spike Timing-Dependent Plasticity of Neural Circuits, 2004, Neuron.

[26] Michael L. Hines, et al. Open Source Brain: A Collaborative Resource for Visualizing, Analyzing, Simulating, and Developing Standardized Models of Neurons and Circuits, 2018, Neuron.

[27] Sander M. Bohte, et al. The evidence for neural information processing with precise spike-times: A survey, 2004, Natural Computing.

[28] W. Gerstner, et al. Time structure of the activity in neural network models, 1995, Physical Review E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics.

[29] L. F. Abbott, et al. Decoding neuronal firing and modelling neural networks, 1994, Quarterly Reviews of Biophysics.

[30] Humberto Sossa, et al. The step size impact on the computational cost of spiking neuron simulation, 2017, 2017 Computing Conference.

[31] Aaditya V. Rangan, et al. Fast numerical methods for simulating large-scale integrate-and-fire neuronal networks, 2007, Journal of Computational Neuroscience.

[32] Wyeth Bair, et al. Spiking neural network simulation: numerical integration with the Parker-Sochacki method, 2009, Journal of Computational Neuroscience.

[33] Wulfram Gerstner, et al. Reduction of the Hodgkin-Huxley Equations to a Single-Variable Threshold Model, 1997, Neural Computation.

[34] Nikola Kasabov, et al. Time-Space, Spiking Neural Networks and Brain-Inspired Artificial Intelligence, 2018, Springer Series on Bio- and Neurosystems.

[35] Jonathan Touboul, et al. Importance of the Cutoff Value in the Quadratic Adaptive Integrate-and-Fire Model, 2008, Neural Computation.

[36] J. Hindmarsh, et al. A model of neuronal bursting using three coupled first order differential equations, 1984, Proceedings of the Royal Society of London. Series B. Biological Sciences.

[37] Michael V. Mascagni. Numerical methods for neuronal modeling, 1989.

[38] Nicholas T. Carnevale, et al. Simulation of networks of spiking neurons: A review of tools and strategies, 2006, Journal of Computational Neuroscience.

[39] Gary B. Lamont, et al. Evolutionary Algorithms for Solving Multi-Objective Problems, 2002, Genetic Algorithms and Evolutionary Computation.

[40] N. Rulkov. Regularization of synchronized chaotic bursts, 2000, Physical Review Letters.

[41] Jonathan Touboul, et al. Sensitivity to the cutoff value in the quadratic adaptive integrate-and-fire model, 2013.

[42] T. Sejnowski. Statistical constraints on synaptic plasticity, 1977, Journal of Theoretical Biology.

[43] Wulfram Gerstner, et al. A benchmark test for a quantitative assessment of simple neuron models, 2008, Journal of Neuroscience Methods.

[44] Bruce W. Knight, et al. Dynamics of Encoding in a Population of Neurons, 1972, The Journal of General Physiology.

[45] L. Abbott, et al. Synaptic plasticity: taming the beast, 2000, Nature Neuroscience.

[46] Wulfram Gerstner, et al. Why spikes? Hebbian learning and retrieval of time-resolved excitation patterns, 1993, Biological Cybernetics.

[47] Eugene M. Izhikevich, et al. Which model to use for cortical spiking neurons?, 2004, IEEE Transactions on Neural Networks.

[48] Sander M. Bohte, et al. Computing with Spiking Neuron Networks, 2012, Handbook of Natural Computing.

[49] Cyrille Rossant, et al. Automatic Fitting of Spiking Neuron Models to Electrophysiological Recordings, 2010, Front. Neuroinform.

[50] R. FitzHugh. Impulses and Physiological States in Theoretical Models of Nerve Membrane, 1961, Biophysical Journal.

[51] R. Stein. A Theoretical Analysis of Neuronal Variability, 1965, Biophysical Journal.

[52] James G. King, et al. Reconstruction and Simulation of Neocortical Microcircuitry, 2015, Cell.

[53] V. I. Nekorkin, et al. Chaotic oscillations in a map-based model of neural activity, 2007, Chaos.

[54] Desmond J. Higham, et al. Numerical Methods for Ordinary Differential Equations - Initial Value Problems, 2010, Springer Undergraduate Mathematics Series.

[55] G. Edelman, et al. Large-scale model of mammalian thalamocortical systems, 2008, Proceedings of the National Academy of Sciences.

[56] Wolfgang Maass, et al. Networks of spiking neurons: the third generation of neural network models, 1997.

[57] Wolfgang Hackbusch, et al. The Concept of Stability in Numerical Mathematics, 2014.

[58] A. Hodgkin, et al. A quantitative description of membrane current and its application to conduction and excitation in nerve, 1952, The Journal of Physiology.

[59] Wulfram Gerstner, et al. The quantitative single-neuron modeling competition, 2008, Biological Cybernetics.

[60] Pablo Varona, et al. Modeling Biological Neural Networks, 2012, Handbook of Natural Computing.

[61] David A. Pope. An exponential method of numerical integration of ordinary differential equations, 1963, CACM.

[62] Walter Gautschi, et al. Numerical Analysis, 1978, Mathemagics: A Magical Journey Through Advanced Mathematics.

[63] James M. Bower, et al. Rallpacks: a set of benchmarks for neuronal simulators, 1992, Trends in Neurosciences.

[64] C. Eliasmith, et al. The use and abuse of large-scale brain models, 2014, Current Opinion in Neurobiology.

[65] Hojjat Adeli, et al. Spiking Neural Networks, 2009, Int. J. Neural Syst.

[66] Eugene M. Izhikevich, et al. Simple model of spiking neurons, 2003, IEEE Transactions on Neural Networks.

[67] Gert Cauwenberghs, et al. Neuromorphic Silicon Neuron Circuits, 2011, Front. Neurosci.

[68] Hojjat Adeli, et al. Third Generation Neural Networks: Spiking Neural Networks, 2009.

[69] Rüdiger W. Brause, et al. The Performance of Approximating Ordinary Differential Equations by Neural Nets, 2008, 2008 20th IEEE International Conference on Tools with Artificial Intelligence.

[70] Eugene M. Izhikevich, et al. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, 2006.

[71] Robert Plonsey, et al. Bioelectricity: A Quantitative Approach, 2013.

[72] Wulfram Gerstner, et al. Spiking Neuron Models, 2002.

[73] Felix Schürmann, et al. Single Neuron Optimization as a Basis for Accurate Biophysical Modeling: The Case of Cerebellar Granule Cells, 2017, Front. Cell. Neurosci.

[74] Steve Furber, et al. Large-scale neuromorphic computing systems, 2016, Journal of Neural Engineering.

[75] J. Butcher. Numerical methods for ordinary differential equations, 2003.

[76] S. Yoshizawa, et al. An Active Pulse Transmission Line Simulating Nerve Axon, 1962, Proceedings of the IRE.

[77] Juan Humberto Sossa Azuela, et al. How the Accuracy and Computational Cost of Spiking Neuron Simulation are Affected by the Time Span and Firing Rate, 2017, Computación y Sistemas.

[78] D. Feldman. The Spike-Timing Dependence of Plasticity, 2012, Neuron.

[79] Timothée Masquelier, et al. Deep Learning in Spiking Neural Networks, 2018, Neural Networks.

[80] Christopher M. Bishop, et al. Pulsed Neural Networks, 1998.

[81] Bertrand Fontaine, et al. Fitting Neuron Models to Spike Trains, 2011, Front. Neurosci.

[82] H. Wilson. Spikes, Decisions, and Actions: The Dynamical Foundations of Neuroscience, 1999.

[83] Ronald J. MacGregor, et al. Neural and brain modeling, 1987.

[84] Trevor Bekolay, et al. A Large-Scale Model of the Functioning Brain, 2012, Science.

[85] Kalyanmoy Deb, et al. Multi-objective optimization using evolutionary algorithms, 2001, Wiley-Interscience Series in Systems and Optimization.

[86] T. Chai, et al. Root mean square error (RMSE) or mean absolute error (MAE)? – Arguments against avoiding RMSE in the literature, 2014.

[87] Wulfram Gerstner, et al. Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition, 2014.

[88] Eduardo Ros, et al. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks, 2017, Front. Neuroinform.