Dendritic neurons can perform linearly separable computations with low-resolution synaptic weights.

In theory, neurons modelled as single-layer perceptrons can implement all linearly separable computations. In practice, however, these computations may require arbitrarily precise synaptic weights. This is a strong constraint, since both biological neurons and their artificial counterparts have to cope with limited precision. Here, we explore how non-linear processing in dendrites helps overcome this constraint. We start by identifying a class of computations that requires increasing weight precision with the number of inputs in a perceptron, and show that it can be implemented without this constraint in a neuron with sub-linear dendritic subunits. We then complement this analytical study with a simulation of a biophysical neuron model with two passive dendrites and a soma, and show that it can implement this computation. This work demonstrates a new role for dendrites in neural computation: by distributing the computation across independent subunits, the same computation can be performed more efficiently with less precise tuning of the synaptic weights. It not only offers new insight into the importance of dendrites for biological neurons, but also paves the way for new, more efficient architectures of artificial neuromorphic chips.
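To make the idea of sub-linear dendritic subunits concrete, here is a minimal sketch (not the paper's exact construction; the function names, the saturation cap, and the threshold are illustrative assumptions). Each dendrite sums its binary inputs with unit-precision weights and saturates, so the soma can distinguish inputs scattered across subunits from inputs clustered on one subunit, a distinction that sub-linear integration supports without finely tuned weights:

```python
def sublinear_neuron(x_d1, x_d2, cap=1.0, theta=1.5):
    """Toy two-dendrite neuron (illustrative, not the paper's model).

    Each dendrite sums its binary inputs with unit weights and
    saturates at `cap` (sub-linear integration); the soma then
    thresholds the sum of the two dendritic outputs at `theta`.
    """
    d1 = min(sum(x_d1), cap)  # saturating (sub-linear) subunit 1
    d2 = min(sum(x_d2), cap)  # saturating (sub-linear) subunit 2
    return int(d1 + d2 >= theta)

# Scattered inputs (one active synapse on each dendrite) reach threshold:
print(sublinear_neuron([1, 0], [1, 0]))  # 1

# Clustered inputs (both active synapses on one dendrite) are capped
# by saturation and fail to reach threshold:
print(sublinear_neuron([1, 1], [0, 0]))  # 0
```

Note that every synaptic weight here is exactly 1: the discrimination comes from the placement of synapses on independent saturating subunits, not from weight precision.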
