SpikeProp: backpropagation for networks of spiking neurons

For a network of spiking neurons with reasonable postsynaptic potentials, we derive a supervised learning rule akin to traditional error-backpropagation, termed SpikeProp, and show how to overcome the discontinuities introduced by thresholding. Using this learning algorithm, we demonstrate how networks of spiking neurons with biologically plausible time constants can perform complex non-linear classification in fast temporal coding just as well as rate-coded networks. When comparing the (implicit) number of neurons required for the respective encodings, it is empirically demonstrated that temporal coding potentially requires significantly fewer neurons.
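To illustrate the core idea of differentiating through a spike threshold, the sketch below trains a single spiking unit so that its first output spike lands at a desired firing time. It linearises the membrane potential around the threshold crossing, which is the standard way SpikeProp-style rules sidestep the thresholding discontinuity. The kernel shape, time constant, threshold, learning rate, and input spike times are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

TAU = 7.0    # membrane time constant (ms); assumed value
THETA = 1.0  # firing threshold; assumed value

def eps(s):
    """Alpha-shaped postsynaptic potential kernel; zero before the presynaptic spike."""
    s = np.asarray(s, dtype=float)
    return np.where(s > 0, (s / TAU) * np.exp(1.0 - s / TAU), 0.0)

def d_eps(s):
    """Time derivative of the kernel, needed for the linearisation at threshold."""
    s = np.asarray(s, dtype=float)
    return np.where(s > 0, (1.0 / TAU) * np.exp(1.0 - s / TAU) * (1.0 - s / TAU), 0.0)

def first_spike_time(w, t_in, t_grid):
    """First time the summed postsynaptic potential crosses the threshold (None if it never fires)."""
    x = (w[None, :] * eps(t_grid[:, None] - t_in[None, :])).sum(axis=1)
    above = np.nonzero(x >= THETA)[0]
    return t_grid[above[0]] if above.size else None

def spikeprop_step(w, t_in, t_target, t_grid, lr=0.05):
    """One gradient step on E = 1/2 (t_out - t_target)^2 for a single output neuron."""
    t_out = first_spike_time(w, t_in, t_grid)
    if t_out is None:
        return w, None
    # Linearise the potential around the threshold crossing:
    # dt_out/dw_i ~= -eps(t_out - t_i) / sum_k w_k * eps'(t_out - t_k)
    denom = np.sum(w * d_eps(t_out - t_in))
    grad = (t_out - t_target) * (-eps(t_out - t_in) / denom)
    return w - lr * grad, t_out

# Toy usage: pull the output spike toward a desired firing time of 9 ms.
rng = np.random.default_rng(0)
t_in = np.array([0.0, 2.0, 4.0])            # presynaptic firing times (ms); assumed
w = rng.uniform(0.3, 0.7, size=t_in.size)   # initial synaptic weights
t_grid = np.arange(0.0, 30.0, 0.01)
for _ in range(200):
    w, t_out = spikeprop_step(w, t_in, t_target=9.0, t_grid=t_grid)
print(t_out)  # should approach 9.0 ms
```

Gradient descent then only ever touches continuous quantities (kernel values and their derivatives at the crossing time), which is what makes backpropagation through layers of such units possible.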