Approximation by neural networks with a bounded number of nodes at each level

It is shown that the general approximation property of feed-forward multilayer perceptron networks can be achieved in networks where the number of nodes in each layer is bounded but the number of layers grows to infinity, provided the node function is twice continuously differentiable and not linear.
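
The following is a minimal numerical sketch of the setting described above, not the paper's construction or proof: a feed-forward network whose width is held fixed at a small number of nodes per layer, using the tanh node function (which is twice continuously differentiable and non-linear), where only the depth is increased. The use of PyTorch, the width of 3, the target function, and the training setup are illustrative assumptions.

```python
# Illustrative sketch only: fixed-width, increasing-depth MLP approximation.
# Assumes PyTorch is available; all numerical choices below are arbitrary.
import torch
import torch.nn as nn


def narrow_deep_mlp(width: int, depth: int) -> nn.Sequential:
    """Build a 1-input, 1-output MLP with `depth` hidden layers of `width` nodes each."""
    layers = [nn.Linear(1, width), nn.Tanh()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.Tanh()]
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)


def fit_and_error(depth: int, width: int = 3, steps: int = 3000) -> float:
    """Train the network to approximate a smooth target on [-1, 1];
    return the maximum absolute error on the training grid."""
    torch.manual_seed(0)
    x = torch.linspace(-1.0, 1.0, 256).unsqueeze(1)
    y = torch.sin(3.0 * x) + 0.5 * x ** 2  # illustrative target function
    model = narrow_deep_mlp(width, depth)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return (model(x) - y).abs().max().item()


if __name__ == "__main__":
    # Keep the width fixed at 3 nodes per layer and increase only the depth.
    for depth in (1, 2, 4, 8):
        print(f"depth={depth:2d}  max error = {fit_and_error(depth):.4f}")
```

In this sketch only the number of layers changes between runs; the per-layer node count stays bounded, mirroring the regime considered in the result stated above (though the theorem itself concerns existence of approximants, not trainability by gradient descent).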