Tuning of learning rate and momentum on backpropagation

Summary form only given. In backpropagation, the adjustment of the interconnecting weights feeding the input-layer and hidden-layer units is governed by the momentum (α) and the learning rate (η). The number of training cycles therefore depends on α and η, so it is necessary to choose the most suitable values for both. By varying α and η, the authors searched for the values best suited to learning. Suitable combinations of α and η obey a constant rule, η = K(1 − α), where the constant K is determined by the ratio between the number of output units and hidden units. This conclusion is important for deciding the size of a neural network.
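As a rough sketch of how the constant rule might be applied in practice, the Python snippet below pairs η = K(1 − α) with the standard momentum weight update. The specific form K = n_output / n_hidden is an assumption (the summary says only that K depends on the ratio of output units to hidden units), and all function names are hypothetical.

    import numpy as np

    def learning_rate(alpha, n_output, n_hidden):
        # Constant rule from the abstract: eta = K * (1 - alpha).
        # Assumption: K is the ratio n_output / n_hidden; the summary
        # does not state the exact form of the ratio.
        K = n_output / n_hidden
        return K * (1.0 - alpha)

    def momentum_update(w, grad, prev_delta, alpha, eta):
        # Standard momentum form of the backpropagation weight update:
        #   delta_w(t) = -eta * dE/dw + alpha * delta_w(t-1)
        delta = -eta * grad + alpha * prev_delta
        return w + delta, delta

    # Example: 10 hidden units, 3 output units, momentum alpha = 0.9
    alpha = 0.9
    eta = learning_rate(alpha, n_output=3, n_hidden=10)  # K = 0.3, eta = 0.03

    rng = np.random.default_rng(0)
    w = rng.normal(size=(10, 3))        # hidden-to-output weights
    grad = rng.normal(size=w.shape)     # gradient of the error w.r.t. w
    prev_delta = np.zeros_like(w)
    w, prev_delta = momentum_update(w, grad, prev_delta, alpha, eta)

Under this rule, raising the momentum α automatically lowers the learning rate η, so the two parameters are tuned together rather than independently.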
