Efficient modeling for multilayer feed-forward neural nets

The authors discuss two important aspects of multilayer feed-forward neural nets: the optimal number of hidden units per layer, and the optimal number of synaptic weights between two adjacent layers. On the basis of simulations, they conjecture that for efficient learning the optimal number of hidden units should be equal to, or slightly greater than, M-1, where M is the number of training pattern pairs used. Locally interconnected nets may be useful for real applications in which geometric properties are significant. Introducing highway links into locally interconnected nets improves convergence speed significantly.
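The abstract does not specify the architecture in detail, but the two ideas can be illustrated concretely. The following is a minimal sketch, assuming a single hidden layer sized to M-1 per the conjecture, local connectivity implemented as a fixed sparsity mask on the input-to-hidden weights, and a highway link realized as a direct input-to-output connection; the window size, activation, initialization, and learning rate are illustrative assumptions, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 16                        # number of training pattern pairs (assumption)
n_in, n_out = 8, 4
n_hidden = M - 1              # conjectured efficient hidden-layer size

X = rng.standard_normal((M, n_in))
Y = rng.standard_normal((M, n_out))

# Local interconnection: each hidden unit sees only a small window of inputs.
window = 3                    # receptive-field width (assumption)
mask = np.zeros((n_in, n_hidden))
for j in range(n_hidden):
    start = (j * n_in) // n_hidden
    mask[start:start + window, j] = 1.0

W1 = rng.standard_normal((n_in, n_hidden)) * mask * 0.1
W2 = rng.standard_normal((n_hidden, n_out)) * 0.1
Wh = rng.standard_normal((n_in, n_out)) * 0.1   # highway (direct) link

def forward(X):
    H = np.tanh(X @ W1)                  # locally connected hidden layer
    return H @ W2 + X @ Wh               # output plus highway path

lr = 0.05
for step in range(500):                  # plain gradient descent on MSE
    H = np.tanh(X @ W1)
    out = H @ W2 + X @ Wh
    err = out - Y
    gW2 = H.T @ err / M
    gWh = X.T @ err / M
    gH = err @ W2.T * (1.0 - H**2)       # backprop through tanh
    gW1 = (X.T @ gH / M) * mask          # keep updates inside local windows
    W1 -= lr * gW1
    W2 -= lr * gW2
    Wh -= lr * gWh

print("final MSE:", float(np.mean((forward(X) - Y) ** 2)))
```

In this sketch the highway link gives the error signal a short, linear path from output to input, which is one plausible reading of why such links would speed convergence in a locally interconnected net.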