Here we analyze synaptic transmission from an information-theoretic perspective. We derive closed-form expressions for lower bounds on the capacity of a simple model of a cortical synapse under two explicit coding paradigms. Under the "signal estimation" paradigm, we assume the signal is encoded in the mean firing rate of a Poisson neuron. The performance of an optimal linear estimator of the signal then provides a lower bound on the capacity for signal estimation. Under the "signal detection" paradigm, the presence or absence of the signal must be detected. The performance of the optimal spike detector allows us to compute a lower bound on the capacity for signal detection. We find that single synapses (for empirically measured parameter values) transmit information poorly, but that significant improvement can be achieved with a small amount of redundancy.
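The signal-estimation bound described above can be illustrated numerically. The sketch below is not the paper's derivation: it uses hypothetical parameter values (mean rate `r0`, bin width `dt`, modulation depth) and a generic Gaussian-channel inequality, I >= -0.5 * log2(1 - rho^2), where rho is the correlation between the signal and its optimal linear estimate, to produce a capacity lower bound from a rate-modulated Poisson simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration (not the paper's measured values).
n_bins = 20000
dt = 0.01                       # bin width in seconds
r0 = 40.0                       # mean firing rate in Hz

# Unit-variance Gaussian signal modulating the Poisson rate.
s = rng.standard_normal(n_bins)
rate = np.clip(r0 * (1.0 + 0.3 * s), 0.0, None)
counts = rng.poisson(rate * dt)          # spike counts per bin

# Optimal linear (least-squares) estimator of s from the counts.
x = counts - counts.mean()
a = (x @ s) / (x @ x)                    # scalar Wiener gain
s_hat = a * x

# Gaussian lower bound on mutual information per bin:
# I >= -0.5 * log2(1 - rho^2), with rho = corr(signal, estimate).
rho = np.corrcoef(s, s_hat)[0, 1]
info_lb = -0.5 * np.log2(1.0 - rho**2)
print(f"corr = {rho:.3f}, capacity lower bound = {info_lb:.3f} bits/bin")
```

With these toy parameters the bound comes out small (a fraction of a bit per bin), consistent with the claim that a single noisy synapse-like channel transmits information poorly; averaging several redundant channels raises rho and hence the bound.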