An optimum weights initialization for improving scaling relationships in BP learning
An algorithm for fast minimum search is proposed that achieves very good performance by adapting both the learning rate and the momentum term in an optimal way, and by checking for, and correcting, both possible increases in the cost function and any moves opposite to the direction of the negative gradient.
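The adaptive step described above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the paper's exact update rule: the names (`adaptive_step`, `up`, `down`) and the specific growth/shrink factors are hypothetical, and the two rejection tests (cost increase, move against the descent direction) follow the abstract's description.

```python
import numpy as np

def adaptive_step(w, grad, cost_fn, lr, momentum, velocity,
                  up=1.05, down=0.5):
    """One gradient-descent-with-momentum step whose learning rate
    adapts to the outcome: the step is rejected (and the rate halved)
    when the proposed move points against the negative gradient or
    when it would increase the cost; otherwise the rate grows slightly.
    Returns (new_weights, new_lr, new_velocity)."""
    step = momentum * velocity - lr * grad
    # Reject moves opposite to the descent direction -grad.
    if np.dot(step, -grad) <= 0:
        return w, lr * down, np.zeros_like(velocity)
    w_new = w + step
    if cost_fn(w_new) > cost_fn(w):
        # Cost increased: undo the move, shrink the rate, reset momentum.
        return w, lr * down, np.zeros_like(velocity)
    return w_new, lr * up, step
```

On a simple quadratic cost, a well-sized step is accepted and the rate grows, while an oversized step is undone and the rate shrinks, which is the qualitative behaviour the abstract attributes to the method.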
The global minimum search performed by restarting the algorithm is further accelerated through an optimal weight-initialization criterion based on testing for the paralyzed state of a neuron, that is, when the magnitudes of both the neuron's output and the output error exceed a fixed threshold. Thanks to these improvements, we obtain scaling relationships in learning more favourable than those previously reported by Tesauro and Janssen.
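The paralysis test and the selective re-initialization it drives might be sketched as below. This is a minimal sketch assuming sigmoid-like units: the function names, thresholds, and uniform re-initialization range are assumptions for illustration, and only the criterion itself (both output magnitude and error magnitude above a threshold) comes from the abstract.

```python
import numpy as np

def paralyzed(output, error, out_thresh=0.9, err_thresh=0.1):
    """Flag a unit as paralyzed when its output is saturated
    (|output| above a threshold) while its error is still large,
    so the near-zero activation derivative blocks further learning."""
    return abs(output) > out_thresh and abs(error) > err_thresh

def reinit_paralyzed(rng, w, outputs, errors, scale=0.5,
                     out_thresh=0.9, err_thresh=0.1):
    """Re-draw the incoming weights (columns of w) only for the units
    flagged as paralyzed, leaving healthy units untouched."""
    w = w.copy()
    for j, (o, e) in enumerate(zip(outputs, errors)):
        if paralyzed(o, e, out_thresh, err_thresh):
            w[:, j] = rng.uniform(-scale, scale, size=w.shape[0])
    return w
```

Re-initializing only the paralyzed units, rather than the whole network, is one plausible reading of how such a criterion would speed up the restart procedure.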
[1] G. Tesauro et al., "Scaling relationships in back-propagation learning," Complex Systems, 1988.
[2] S. E. Fahlman, "An empirical study of learning speed in back-propagation networks," 1988.
[3] R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, 1964.