Spiking Hyperdimensional Network: Neuromorphic Models Integrated with Memory-Inspired Framework

Recently, brain-inspired computing models have shown great potential to outperform today's deep learning solutions in terms of robustness and energy efficiency. In particular, Spiking Neural Networks (SNNs) and HyperDimensional Computing (HDC) have shown promising results in enabling efficient and robust cognitive learning. Despite their success, these two brain-inspired models have different strengths: while SNNs mimic the physical properties of the human brain, HDC models the brain at a more abstract, functional level. Their design philosophies are complementary, which motivates combining them. Guided by the classical psychological model of memory, we propose SpikeHD, the first framework that fundamentally combines spiking neural networks and hyperdimensional computing. SpikeHD yields a scalable and strong cognitive learning system that better mimics brain functionality. SpikeHD exploits spiking neural networks to extract low-level features while preserving the spatial and temporal correlation of raw event-based spike data. It then uses HDC to operate on the SNN output by mapping the signal into high-dimensional space, learning the abstract information, and classifying the data. Our extensive evaluation on a set of benchmark classification problems shows that, compared to an SNN-only architecture, SpikeHD (1) significantly enhances learning capability by exploiting two-stage information processing, (2) provides substantial robustness to noise and failure, and (3) reduces the network size and the number of parameters required to learn complex information.
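To make the two-stage pipeline concrete, the sketch below illustrates the general idea rather than the authors' implementation: a toy leaky integrate-and-fire (LIF) layer converts event-based spike trains into a low-level feature vector, and a random-projection hyperdimensional encoder maps that vector into a bipolar hypervector that is bundled into per-class prototypes and classified by cosine similarity. All names, dimensions, weights, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Stage 1: toy leaky integrate-and-fire (LIF) layer ---------------------
# Input: binary spike trains of shape (T, n_in); output: spike counts per
# hidden neuron, used here as the low-level feature vector.
def lif_features(spikes, weights, tau=0.9, threshold=1.0):
    n_hidden = weights.shape[1]
    v = np.zeros(n_hidden)          # membrane potentials
    counts = np.zeros(n_hidden)     # output spike counts (the "features")
    for t in range(spikes.shape[0]):
        v = tau * v + spikes[t] @ weights   # leak + integrate input current
        fired = v >= threshold
        counts += fired
        v[fired] = 0.0                      # reset neurons that fired
    return counts

# --- Stage 2: hyperdimensional encoding and classification -----------------
def hd_encode(features, projection):
    # Project the feature vector into D dimensions and binarize to {-1, +1}.
    return np.sign(projection @ features + 1e-12)

def hd_train(encodings, labels, n_classes, D):
    # Bundle (sum) the hypervectors of each class into a prototype.
    prototypes = np.zeros((n_classes, D))
    for h, y in zip(encodings, labels):
        prototypes[y] += h
    return prototypes

def hd_classify(h, prototypes):
    # Cosine similarity against each class prototype; highest wins.
    sims = prototypes @ h / (np.linalg.norm(prototypes, axis=1)
                             * np.linalg.norm(h) + 1e-12)
    return int(np.argmax(sims))

# --- Tiny end-to-end example on synthetic spike data -----------------------
T, n_in, n_hidden, D, n_classes = 50, 32, 16, 2048, 3
W = rng.normal(0, 0.3, size=(n_in, n_hidden))     # random SNN weights
P = rng.choice([-1.0, 1.0], size=(D, n_hidden))   # random HD projection

def make_sample(c):
    # Each class has a different input firing-rate profile.
    rates = 0.05 + 0.15 * (np.arange(n_in) % n_classes == c)
    return (rng.random((T, n_in)) < rates).astype(float)

train = [(make_sample(c), c) for c in range(n_classes) for _ in range(20)]
encodings = [hd_encode(lif_features(x, W), P) for x, _ in train]
prototypes = hd_train(encodings, [y for _, y in train], n_classes, D)

test_x, test_y = make_sample(1), 1
pred = hd_classify(hd_encode(lif_features(test_x, W), P), prototypes)
print(f"true class: {test_y}, predicted: {pred}")
```

In the paper's framework the SNN stage is trained (e.g., with surrogate-gradient methods) rather than fixed, but the sketch captures the division of labor: spatiotemporal feature extraction in the spiking stage, then single-pass, noise-tolerant learning over hypervectors in the HDC stage.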
