Circuits composed of threshold gates (McCulloch-Pitts neurons, or perceptrons) are simplified models of neural circuits with the advantage that they are theoretically more tractable than their biological counterparts. However, when such threshold circuits are designed to perform a specific computational task, they usually differ in one important respect from computations in the brain: they require very high activity. On average, every second threshold gate fires (outputs a 1) during a computation. By contrast, the activity of neurons in the brain is much sparser, with only about 1% of neurons firing. This mismatch between threshold and neuronal circuits is due to the particular complexity measures (circuit size and circuit depth) that have been minimized in previous threshold circuit constructions. In this letter, we investigate a new complexity measure for threshold circuits, energy complexity, whose minimization yields computations with sparse activity. We prove that all computations by threshold circuits of polynomial size with entropy O(log n) can be restructured so that their energy complexity is reduced to a level near the entropy of circuit states. This entropy of circuit states is a novel circuit complexity measure that is of interest not only in the context of threshold circuits but for circuit complexity in general. As an example of how this measure can be applied, we show that any polynomial-size threshold circuit with entropy O(log n) can be simulated by a polynomial-size threshold circuit of depth 3. Our results demonstrate that the structure of circuits that results from minimizing their energy complexity is quite different from the structure that results from minimizing previously considered complexity measures, and is potentially closer to the structure of neural circuits in the nervous system. In particular, different pathways are activated in these circuits for different classes of inputs. This letter shows that such circuits with sparse activity have surprisingly large computational power.
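
As a concrete illustration of the two measures discussed above, the following minimal Python sketch computes both for a toy three-gate threshold circuit. The circuit itself, the uniform input distribution, and the choice of the maximum (rather than expected) number of firing gates as the energy measure are illustrative assumptions for this sketch, not constructions taken from the letter.

    import itertools
    import math

    def threshold_gate(weights, theta, inputs):
        """Fire (output 1) iff the weighted sum reaches the threshold theta."""
        return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

    def circuit_states(n):
        """For every n-bit input, record the vector of gate outputs of a toy
        circuit: g1 = AND of all inputs, g2 = OR of all inputs (both realized
        as threshold gates), and an output gate g3 = g1 OR g2."""
        states = []
        for x in itertools.product([0, 1], repeat=n):
            g1 = threshold_gate([1] * n, n, x)      # fires only on the all-ones input
            g2 = threshold_gate([1] * n, 1, x)      # fires on any nonzero input
            g3 = threshold_gate([1, 1], 1, (g1, g2))
            states.append((g1, g2, g3))
        return states

    def energy(states):
        """Energy complexity (illustrative variant): maximum number of gates
        that fire on any single input."""
        return max(sum(s) for s in states)

    def state_entropy(states):
        """Entropy of circuit states: Shannon entropy (in bits) of the
        distribution of gate-output vectors under uniformly random inputs."""
        counts = {}
        for s in states:
            counts[s] = counts.get(s, 0) + 1
        total = len(states)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    if __name__ == "__main__":
        states = circuit_states(n=4)
        print("energy complexity:", energy(states))        # 3: all gates fire on the all-ones input
        print("entropy of states: %.3f bits" % state_entropy(states))

The sketch makes the distinction between the two measures visible: the toy circuit reaches only three distinct state vectors over all 16 inputs, so its state entropy is well below one bit even though up to three gates fire at once.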