Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

Biological neurons communicate with a sparing exchange of pulses, or spikes. How real spiking neurons produce the kind of powerful neural computation possible with deep artificial neural networks, while using so few spikes to communicate, remains an open question. Building on recent insights from neuroscience, we present an Adapting Spiking Neural Network (ASNN) based on adaptive spiking neurons. These spiking neurons efficiently encode information in spike trains using a form of Asynchronous Pulsed Sigma-Delta coding while homeostatically optimizing their firing rate. In the proposed paradigm of spiking neural computation, neural adaptation is tightly coupled to synaptic plasticity to ensure that downstream neurons can correctly decode upstream spiking neurons. We show that this type of network is inherently able to carry out asynchronous, event-driven neural computation while performing identically to corresponding artificial neural networks (ANNs). In particular, we show that these adaptive spiking neurons can serve as drop-in replacements for ReLU neurons in standard feedforward ANNs comprised of such units. We demonstrate that this substitution can also be applied successfully to a ReLU-based deep convolutional neural network for classifying the MNIST dataset. The ASNN thus outperforms current Spiking Neural Network (SNN) implementations, while responding up to an order of magnitude faster and using an order of magnitude fewer spikes. Additionally, in a streaming setting where frames are classified continuously, we show that the ASNN requires substantially fewer network updates than the corresponding ANN.
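To make the encoding idea concrete, the following is a minimal sketch (not the paper's implementation) of how an adaptive spiking neuron can perform sigma-delta-like coding: the neuron fires whenever the error between its input and the signal already communicated exceeds a fraction of an adaptive threshold; each spike raises the threshold multiplicatively (adaptation), and a downstream decoder reconstructs the signal by adding the current threshold per spike. All parameter names (`theta0`, `mf`, `tau`) and their values are illustrative assumptions.

```python
import numpy as np

def asn_encode_decode(signal, theta0=0.1, mf=0.2, tau=20.0):
    """Sketch of adaptive spiking (sigma-delta-like) encoding/decoding.

    signal : sequence of input values, one per discrete time step
    theta0 : resting threshold (hypothetical value)
    mf     : multiplicative threshold increase per spike (hypothetical)
    tau    : decay time constant, in time steps (hypothetical)
    Returns (spike_times, decoded_trace).
    """
    theta = theta0      # adaptive firing threshold
    s_hat = 0.0         # signal as reconstructed by a downstream decoder
    spikes, decoded = [], []
    for t, s in enumerate(signal):
        err = s - s_hat             # coding error driving the neuron
        if err > theta / 2.0:       # fire when the error is large enough
            spikes.append(t)
            s_hat += theta          # decoder adds the *current* threshold,
            theta += mf * theta     # which then adapts multiplicatively
        # both the reconstruction and the threshold decay exponentially
        s_hat *= np.exp(-1.0 / tau)
        theta = theta0 + (theta - theta0) * np.exp(-1.0 / tau)
        decoded.append(s_hat)
    return spikes, decoded
```

With a constant positive input, the neuron fires a rapid initial burst, then settles into sparse spiking that keeps the decoded trace near the input, illustrating why the coding can be both fast and spike-efficient; a zero input produces no spikes at all. Note the coupling the abstract describes: because the decoder must add the sender's current threshold, any adaptation of the threshold has to be mirrored on the receiving side.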
