Lower bound for connectivity in local-learning neural networks

Abstract

How does the connectivity of a neural network (the number of synapses per neuron) relate to the complexity of the problems it can handle? Switching theory would suggest no relation at all, since every Boolean function can be implemented by a circuit of very low connectivity (e.g., one built from two-input NAND gates). However, for a network that learns a problem from examples using a local learning rule, we prove that the entropy of the problem is a lower bound on the connectivity of the network. The present result generalizes a previous one by removing a restriction on the features that are loaded into the neurons during the learning phase.
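To illustrate the switching-theory claim referenced above, the sketch below (an illustration not taken from the paper) composes standard Boolean gates entirely from a single two-input NAND primitive, demonstrating its functional completeness:

```python
def nand(a: int, b: int) -> int:
    """The single primitive: two-input NAND."""
    return 1 - (a & b)

# Every other gate is built purely from NAND.
def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor_(a: int, b: int) -> int:
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

if __name__ == "__main__":
    # Verify the composed gates against their truth tables.
    for a in (0, 1):
        for b in (0, 1):
            assert and_(a, b) == (a & b)
            assert or_(a, b) == (a | b)
            assert xor_(a, b) == (a ^ b)
    print("all gates match their truth tables")
```

Since {NAND} is functionally complete, any Boolean function can be realized by such a circuit while every gate keeps fan-in two, which is the low-connectivity point the abstract contrasts with the learning setting.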