Singularity of Multi-Layered Neural Networks on Back-Propagation
It is well known that Neural Networks (NN) with Back-Propagation (BP) are used for recognition and learning. The basic network has three layers: an input layer, one hidden layer, and an output layer, and the scale of a 3-layered NN depends on the number of hidden-layer units (with the numbers of input and output units fixed). In this paper we build multi-layered NN with four or five layers on BP (an input layer, two or three hidden layers, and an output layer) and compare 3-layered NN with multi-layered NN in terms of convergence. As a result, the convergence of multi-layered NN is very low compared with 3-layered NN. However, multi-layered NN extract two meanings from the learning data, such as shape and density.

1. INTRODUCTION

At present the scale of Neural Networks (NN) is small. The basic NN has three layers: an input layer, one hidden layer, and an output layer. To learn more complicated data, we need a way to increase the number of paths from the input layer to the output layer. There are two ways to increase this combination: increasing the number of hidden units, or increasing the number of hidden layers. When we build a NN to learn many data, we generally increase the number of hidden units. In this paper we instead build multi-layered NN with four or five layers (an input layer, two or three hidden layers, and an output layer) by increasing the number of hidden layers. A 3-layered NN extracts features of the learning data from only one hidden layer, but a multi-layered NN can extract them from two or three hidden layers, so it is possible that multi-layered NN can deal with more complicated data. However, the meaning of adding layers and the role of each layer are not yet clear. As a result, multi-layered NN are better than 3-layered NN at recognizing multi-valued data, but they are worse in terms of convergence because they fall into local minima. We therefore suggest a way to improve the convergence rate of learning.
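To make the depth comparison concrete, here is a minimal sketch of plain back-propagation with a configurable number of hidden layers, so the 3-, 4-, and 5-layered networks described above differ only in the `hidden_sizes` argument. The sigmoid activation, squared-error loss, learning rate, XOR task, and all names (`MLP`, `train_step`) are illustrative assumptions, not the paper's actual experimental setup.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """Fully connected net trained by plain back-propagation (BP).

    hidden_sizes=[h] gives the 3-layered NN of the paper;
    [h1, h2] and [h1, h2, h3] give the 4- and 5-layered variants.
    """
    def __init__(self, n_in, hidden_sizes, n_out, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        sizes = [n_in] + list(hidden_sizes) + [n_out]
        self.W = [rng.uniform(-0.5, 0.5, (a, b))
                  for a, b in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(n) for n in sizes[1:]]
        self.lr = lr

    def forward(self, x):
        acts = [x]                       # activations, layer by layer
        for W, b in zip(self.W, self.b):
            acts.append(sigmoid(acts[-1] @ W + b))
        return acts

    def train_step(self, x, t):
        acts = self.forward(x)
        # delta at the output layer: squared error with sigmoid units
        delta = (acts[-1] - t) * acts[-1] * (1.0 - acts[-1])
        for i in reversed(range(len(self.W))):
            # propagate the error backwards before updating W[i]
            prev = (delta @ self.W[i].T) * acts[i] * (1.0 - acts[i])
            self.W[i] -= self.lr * np.outer(acts[i], delta)
            self.b[i] -= self.lr * delta
            delta = prev
        return 0.5 * np.sum((acts[-1] - t) ** 2)

# Toy comparison on XOR: same unit count per hidden layer,
# increasing depth only.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0.0], [1.0], [1.0], [0.0]])

for hidden in ([4], [4, 4], [4, 4, 4]):     # 3-, 4-, 5-layered NN
    net = MLP(2, hidden, 1)
    for _ in range(5000):
        err = sum(net.train_step(x, t) for x, t in zip(X, T))
    print(f"{len(hidden) + 2}-layered NN: final error {err:.4f}")
```

On a toy task like this, the deeper variants typically need more epochs to escape high-error plateaus, which is at least consistent with the paper's observation that multi-layered NN converge less reliably than 3-layered NN; the exact behavior depends on the initialization and learning rate assumed here.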