Singularity of multilayered neural networks on backpropagation

Neural networks (NNs) trained with backpropagation (BP) are widely used for recognition and learning. The basic network has three layers: an input layer, one hidden layer, and an output layer; with the numbers of input and output units fixed, the scale of a 3-layered NN depends on the number of hidden-layer units. In this paper the authors construct multilayered (4- and 5-layered) NNs trained with BP (an input layer, two or three hidden layers, and an output layer) and compare them with a 3-layered NN in terms of convergence. The results show that the convergence of a multilayered NN is much slower than that of a 3-layered NN. However, a multilayered NN extracts two kinds of meaning from the learning data, such as shape and density.
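The comparison described above can be sketched in code. The following is a minimal NumPy implementation of plain backpropagation on fully connected sigmoid networks; the XOR task, the layer widths, the learning rate, and the epoch count are illustrative assumptions, not values taken from the paper. The same training routine is run with one hidden layer (3-layered) and with two hidden layers (4-layered) so their error curves can be compared.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_mlp(layer_sizes, X, Y, lr=0.5, epochs=5000, seed=0):
    """Train a fully connected sigmoid MLP with plain backpropagation.

    layer_sizes -- e.g. [2, 4, 1] for a 3-layered NN (one hidden layer).
    Returns the mean squared error recorded at each epoch.
    """
    rng = np.random.default_rng(seed)
    Ws = [rng.normal(0.0, 0.5, (a, b))
          for a, b in zip(layer_sizes[:-1], layer_sizes[1:])]
    bs = [np.zeros((1, b)) for b in layer_sizes[1:]]
    errors = []
    for _ in range(epochs):
        # Forward pass, keeping each layer's activations for the backward pass.
        acts = [X]
        for W, b in zip(Ws, bs):
            acts.append(sigmoid(acts[-1] @ W + b))
        errors.append(float(np.mean((acts[-1] - Y) ** 2)))
        # Backward pass: output delta for squared error with sigmoid units.
        delta = (acts[-1] - Y) * acts[-1] * (1.0 - acts[-1])
        for i in range(len(Ws) - 1, -1, -1):
            grad_W = acts[i].T @ delta
            grad_b = delta.sum(axis=0, keepdims=True)
            if i > 0:
                # Propagate the delta before this layer's weights are updated.
                delta = (delta @ Ws[i].T) * acts[i] * (1.0 - acts[i])
            Ws[i] -= lr * grad_W
            bs[i] -= lr * grad_b
    return errors

# XOR as a toy learning task (an assumption for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

e3 = train_mlp([2, 4, 1], X, Y)     # 3-layered: one hidden layer
e4 = train_mlp([2, 4, 4, 1], X, Y)  # 4-layered: two hidden layers
```

Plotting or comparing `e3` and `e4` across several random seeds gives a simple version of the convergence comparison the abstract describes; with more hidden layers, sigmoid networks trained this way typically need more epochs to reach the same error.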