A Geometric Approach to the Structural Synthesis of Multilayer Perceptron Neural Networks

Designing a multilayer perceptron for general-purpose classification has important practical implications. Since the capacity of a multilayer perceptron to realize arbitrary dichotomies (two-class classifications) is limited, the most important step in a design procedure, apart from determining the weights and threshold values, is choosing the number of layers and the number of nodes in each layer. Unfortunately, no general principle or guideline has been available for this synthesis task: design typically proceeds on an ad hoc, empirical basis, and such methods generally yield a structure suited only to a particular classification problem [1], [2].
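Why structure matters can be seen in the classic XOR dichotomy: no single-layer perceptron can separate it, but a two-layer network with two hidden threshold units can. The sketch below is an illustrative example (not taken from the paper) with hand-chosen weights and thresholds, showing how a minimal hidden layer realizes this dichotomy.

```python
def step(x):
    """Threshold (Heaviside) activation: 1 if x >= 0, else 0."""
    return 1 if x >= 0 else 0

def mlp_xor(x1, x2):
    """Two-layer perceptron with fixed weights realizing XOR.

    Weights and thresholds are hand-chosen for illustration:
    h1 acts as an OR gate, h2 as an AND gate, and the output
    unit computes OR-but-not-AND, i.e. exclusive-or.
    """
    h1 = step(x1 + x2 - 0.5)   # fires if at least one input is 1
    h2 = step(x1 + x2 - 1.5)   # fires only if both inputs are 1
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, mlp_xor(a, b))
```

With fewer than two hidden units (or no hidden layer at all) this dichotomy cannot be realized, which is exactly the kind of structural constraint a synthesis procedure must account for.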

[1] G. Mirchandani et al., "On hidden nodes for neural nets," 1989.

[2] J. Slawny et al., "Back propagation fails to separate where perceptrons succeed," 1989.

[3] P. Rujan et al., "A geometric approach to learning in neural networks," International 1989 Joint Conference on Neural Networks, 1989.

[4] Eric B. Baum et al., "On the capabilities of multilayer perceptrons," J. Complex., 1988.

[5] U. Ramacher et al., "A geometrical approach to neural network design," International 1989 Joint Conference on Neural Networks, 1989.