Spiking Neural Networks Through the Lens of Streaming Algorithms

We initiate the study of biological neural networks from the perspective of streaming algorithms. Like computers, human brains suffer from memory limitations, which pose a significant obstacle when processing large-scale and dynamically changing data. In computer science, these challenges are captured by the well-known streaming model, which can be traced back to Munro and Paterson '78 and has had significant impact in theory and beyond. In the classical streaming setting, one must compute some function $f$ of a stream of updates $\mathcal{S} = \{u_1,\ldots,u_m\}$, given restricted single-pass access to the stream. The primary complexity measure is the space used by the algorithm. We take the first steps towards understanding the connection between streaming and neural algorithms. On the upper bound side, we design neural algorithms based on known streaming algorithms for fundamental tasks, including distinct elements, approximate median, heavy hitters, and more. The number of neurons in our neural solutions almost matches the space bounds of the corresponding streaming algorithms. As a general algorithmic primitive, we show how to implement the important streaming technique of linear sketching efficiently in spiking neural networks. On the lower bound side, we give a generic reduction, showing that any space-efficient spiking neural network can be simulated by a space-efficient streaming algorithm. This reduction lets us translate streaming-space lower bounds into nearly matching neural-space lower bounds, establishing a close connection between these two models.
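To make the linear-sketching primitive concrete, the following is a minimal sketch of the Count-Min data structure (Cormode and Muthukrishnan [22]), a classic linear sketch used for heavy hitters. It is an illustrative classical (non-neural) example only, not the paper's spiking implementation; the width, depth, seed, and hash construction are assumptions chosen for demonstration rather than parameters from the paper.

```python
import random

class CountMinSketch:
    """Minimal illustrative Count-Min sketch (a linear sketch); parameters are arbitrary."""

    def __init__(self, width=256, depth=4, seed=0):
        rng = random.Random(seed)
        self.width = width
        self.depth = depth
        # One simple (a*x + b) mod p mod width hash per row; p is a Mersenne prime.
        self.prime = (1 << 61) - 1
        self.hashes = [(rng.randrange(1, self.prime), rng.randrange(self.prime))
                       for _ in range(depth)]
        self.table = [[0] * width for _ in range(depth)]

    def _buckets(self, item):
        x = hash(item) & 0xFFFFFFFF
        for a, b in self.hashes:
            yield ((a * x + b) % self.prime) % self.width

    def update(self, item, delta=1):
        # Linearity: the sketch of a stream is the sum of the sketches of its updates,
        # so positive and negative (turnstile) updates are both supported.
        for row, col in enumerate(self._buckets(item)):
            self.table[row][col] += delta

    def estimate(self, item):
        # Each row overestimates the true frequency; take the minimum across rows.
        return min(self.table[row][col] for row, col in enumerate(self._buckets(item)))


if __name__ == "__main__":
    cms = CountMinSketch()
    stream = ["a"] * 100 + ["b"] * 10 + ["c"]
    for u in stream:
        cms.update(u)
    # Estimates are upper bounds on the true counts (100, 10, 1) with high probability.
    print(cms.estimate("a"), cms.estimate("b"), cms.estimate("c"))
```

Because the sketch is a linear map of the frequency vector, sketches of separate streams can be added coordinate-wise to obtain the sketch of their concatenation, which is the property the paper's neural implementation of linear sketching relies on.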

[1]  Wolfgang Maass,et al.  On the Computational Power of Winner-Take-All , 2000, Neural Computation.

[2]  Tobi Delbrück,et al.  Training Deep Spiking Neural Networks Using Backpropagation , 2016, Front. Neurosci..

[3]  Noga Alon,et al.  The space complexity of approximating the frequency moments , 1996, STOC '96.

[4]  David P. Woodruff Optimal space lower bounds for all frequency moments , 2004, SODA '04.

[5]  Nancy A. Lynch,et al.  Computational Tradeoffs in Biological Neural Networks: Self-Stabilizing Winner-Take-All Networks , 2016, ITCS.

[6]  David P. Woodruff,et al.  An optimal algorithm for the distinct elements problem , 2010, PODS '10.

[7]  Philippe Flajolet,et al.  Loglog Counting of Large Cardinalities (Extended Abstract) , 2003, ESA.

[8]  Aaron Sidford,et al.  Fast and Space Efficient Spectral Sparsification in Dynamic Streams , 2020, SODA.

[9]  David P. Woodruff,et al.  Turnstile streaming algorithms might as well be linear sketches , 2014, STOC.

[10]  Santosh S. Vempala,et al.  Brain Computation: A Computer Science Perspective , 2019, Computing and Software Science.

[11]  Merav Parter,et al.  Counting to Ten with Two Fingers: Compressed Counting with Spiking Neurons , 2019, ESA.

[12]  Noam Nisan,et al.  Hardness vs Randomness , 1994, J. Comput. Syst. Sci..

[13]  Sanjoy Dasgupta,et al.  A neural algorithm for a fundamental computing problem , 2017 .

[14]  David P. Woodruff,et al.  A Tight Lower Bound for High Frequency Moment Estimation with Small Error , 2013, APPROX-RANDOM.

[15]  Bruce G. Lindsay,et al.  Approximate medians and other quantiles in one pass and with limited memory , 1998, SIGMOD '98.

[16]  Kai-Min Chung,et al.  On the Algorithmic Power of Spiking Neural Networks , 2018, ITCS.

[17]  John Kallaugher,et al.  Separations and equivalences between turnstile streaming and linear sketching , 2019, STOC.

[18]  Merav Parter,et al.  The Computational Cost of Asynchronous Neural Communication , 2020, ITCS.

[19]  Larry Carter,et al.  Universal Classes of Hash Functions , 1979, J. Comput. Syst. Sci..

[20]  Wolfgang Maass,et al.  Networks of Spiking Neurons: The Third Generation of Neural Network Models , 1996, Electron. Colloquium Comput. Complex..

[21]  Nancy A. Lynch,et al.  Spike-Based Winner-Take-All Computation: Fundamental Limits and Order-Optimal Circuits , 2019, Neural Computation.

[22]  Graham Cormode,et al.  An improved data stream summary: the count-min sketch and its applications , 2004, J. Algorithms.

[23]  Avi Wigderson,et al.  P = BPP if E requires exponential circuits: derandomizing the XOR lemma , 1997, STOC '97.

[24]  Timothée Masquelier,et al.  Deep Learning in Spiking Neural Networks , 2018, Neural Networks.

[25]  Piotr Indyk,et al.  Stable distributions, pseudorandom generators, embeddings, and data stream computation , 2006, JACM.

[26]  Nancy A. Lynch,et al.  Integrating Temporal Information to Spatial Information in a Neural Circuit , 2019, DISC.

[27]  J. Ian Munro,et al.  Selection and sorting with limited storage , 1978, 19th Annual Symposium on Foundations of Computer Science (sfcs 1978).

[28]  Aoqian Zhang,et al.  A Survey of Approximate Quantile Computation on Large-Scale Data , 2020, IEEE Access.

[29]  Piotr Indyk,et al.  K-median clustering, model-based compressive sensing, and sparse recovery for earth mover distance , 2011, STOC '11.

[30]  David P. Woodruff,et al.  Tight lower bounds for the distinct elements problem , 2003, 44th Annual IEEE Symposium on Foundations of Computer Science, 2003. Proceedings..

[31]  Santosh S. Vempala,et al.  Long Term Memory and the Densest K-Subgraph Problem , 2018, ITCS.

[32]  P. Flajolet,et al.  HyperLogLog: the analysis of a near-optimal cardinality estimation algorithm , 2007 .

[33]  Edo Liberty,et al.  Optimal Quantile Approximation in Streams , 2016, 2016 IEEE 57th Annual Symposium on Foundations of Computer Science (FOCS).

[34]  Santosh S. Vempala,et al.  Random Projection in the Brain and Computation with Assemblies of Neurons , 2019, ITCS.

[35]  Nancy A. Lynch,et al.  Neuro-RAM Unit with Applications to Similarity Testing and Compression in Spiking Neural Networks , 2017, DISC.

[36]  Luca Trevisan,et al.  Counting Distinct Elements in a Data Stream , 2002, RANDOM.

[37]  Moses Charikar,et al.  Finding frequent items in data streams , 2002, Theor. Comput. Sci..

[38]  Piotr Indyk,et al.  Comparing Data Streams Using Hamming Norms (How to Zero In) , 2002, VLDB.

[39]  Leslie G. Valiant,et al.  Capacity of Neural Networks for Lifelong Learning of Composable Tasks , 2017, 2017 IEEE 58th Annual Symposium on Foundations of Computer Science (FOCS).

[40]  Wolfgang Maass,et al.  Networks of spiking neurons: the third generation of neural network models , 1997 .

[41]  S. Muthukrishnan,et al.  Data streams: algorithms and applications , 2005, SODA '03.

[42]  Jaroslaw Blasiok,et al.  Optimal Streaming and Tracking Distinct Elements with High Probability , 2018, SODA.

[43]  P. Chassaing,et al.  Efficient estimation of the cardinality of large data sets , 2007, math/0701347.

[44]  Merav Parter,et al.  Random Sketching, Clustering, and Short-Term Memory in Spiking Neural Networks , 2020, ITCS.

[45]  Nancy Lynch,et al.  Spiking Neural Networks: An Algorithmic Perspective (Extended Abstract) , 2017 .

[46]  Wolfgang Maass,et al.  On the Computational Power of Noisy Spiking Neurons , 1995, NIPS.