Stochastic Hebbian learning with binary synapses

This paper explores a variant of Hebbian learning in which binary synapses are updated stochastically rather than deterministically. In this variant, a single potentiation or depression event sets a synaptic weight to one or zero, respectively, with a finite probability, provided the weight does not already hold that value. This learning rule is compared with the conventional Hebbian rule, in which a continuous-valued synapse moves a fraction of the way towards 1.0 or 0.0. It is shown that, given a set of input-output pattern pairs, the expected value of a particular synapse is the same under both learning rules. Moreover, as the network size and the input activity levels increase, the signal-to-noise ratio of the dendritic sums approaches infinity. These stochastic binary synapses are therefore presented as a viable mechanism for the VLSI implementation of Hebbian-based neural networks.
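
To make the comparison concrete, the following is a minimal sketch (not from the paper) of the two update rules in Python/NumPy; the probability p, the function names, and the event sequence are illustrative assumptions. Averaging many independent binary synapses shows the ensemble mean tracking the single continuous-valued weight, consistent with the equal-expected-value claim.

```python
import numpy as np

rng = np.random.default_rng(0)

def fractional_update(w, potentiate, p):
    """Conventional Hebbian rule: move the continuous weight a fraction p
    of the way towards 1.0 (potentiation) or 0.0 (depression)."""
    target = 1.0 if potentiate else 0.0
    return w + p * (target - w)

def stochastic_binary_update(w, potentiate, p):
    """Stochastic rule for an array of binary weights: set each weight to
    one (potentiation) or zero (depression) with probability p, if it does
    not already hold that value."""
    target = 1.0 if potentiate else 0.0
    flip = (w != target) & (rng.random(w.shape) < p)
    w = w.copy()
    w[flip] = target
    return w

# Drive both rules with the same arbitrary sequence of potentiation/depression
# events; the mean of the binary ensemble should track the continuous weight.
p = 0.3                          # assumed learning probability / step fraction
w_binary = np.zeros(100_000)     # many independent binary synapses, all start at 0
w_frac = 0.0                     # one continuous-valued synapse, starts at 0

for potentiate in [True, True, False, True, False]:
    w_binary = stochastic_binary_update(w_binary, potentiate, p)
    w_frac = fractional_update(w_frac, potentiate, p)

print(w_binary.mean(), w_frac)   # agree to within sampling noise
```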