An optimum weight initialization for improving scaling relationships in BP learning

An algorithm for fast minimum search is proposed which achieves very good performance by making both the learning rate and the momentum term adaptive in an optimum way, and by detecting and correcting both increases of the cost function and moves opposite to the direction of the negative gradient. The global minimum search by restarting the algorithm is further accelerated through an optimum weight-initialization criterion based on testing for neuron paralysis, that is, the state in which the magnitudes of both a neuron's output and its output error exceed a fixed threshold. Thanks to these improvements, we obtain scaling relationships in learning more favourable than those previously reported by Tesauro and Janssens.
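
To make the two mechanisms concrete, the following is a minimal sketch in Python/NumPy of (a) a gradient step with adaptive learning rate and momentum that rejects moves which raise the cost or oppose the negative gradient, and (b) a paralysis test used as a restart criterion. The toy network, the thresholds out_thr and err_thr, and the adaptation factors 0.5 and 1.05 are illustrative assumptions, not the paper's actual values or its adaptation rule.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy single-layer tanh network: 20 patterns, 4 inputs, 2 outputs.
    X = rng.uniform(-1.0, 1.0, size=(20, 4))
    T = np.sign(X @ rng.uniform(-1.0, 1.0, size=(4, 2)))  # +/-1 targets

    def forward(W, X):
        return np.tanh(X @ W)

    def cost(W):
        return 0.5 * np.mean((T - forward(W, X)) ** 2)

    def grad(W):
        Y = forward(W, X)
        return X.T @ ((Y - T) * (1.0 - Y ** 2)) / T.size

    def paralyzed(Y, E, out_thr=0.9, err_thr=0.3):
        """A unit is flagged as paralyzed when the magnitudes of both
        its output and its output error exceed fixed thresholds: it is
        saturated while still wrong, so the tanh derivative (1 - Y**2)
        makes the backpropagated gradient vanish."""
        return (np.abs(Y) > out_thr) & (np.abs(E) > err_thr)

    W = rng.uniform(-3.0, 3.0, size=(4, 2))  # deliberately large weights
    V = np.zeros_like(W)                     # momentum "velocity"
    lr, mom = 0.5, 0.9
    c_prev = cost(W)

    for epoch in range(200):
        G = grad(W)
        step = -lr * G + mom * V
        # Control 1: drop the momentum term if the resulting step
        # opposes the direction of the negative gradient.
        if np.sum(step * -G) < 0.0:
            step = -lr * G
        c_new = cost(W + step)
        # Control 2: on a cost increase, reject the move and shrink
        # both rates; otherwise accept and grow the learning rate.
        if c_new > c_prev:
            lr *= 0.5
            mom *= 0.5
            V[:] = 0.0
            continue
        W, V, c_prev = W + step, step, c_new
        lr *= 1.05

    # Restart criterion: redraw the fan-in weights of any paralyzed
    # output unit from a small range so it leaves the flat region of
    # the sigmoid before the global minimum search is restarted.
    Y = forward(W, X)
    mask = paralyzed(Y, T - Y).any(axis=0)
    W[:, mask] = rng.uniform(-0.5, 0.5, size=(W.shape[0], mask.sum()))

In the paper's setting, the same kind of test would drive how the weights are re-initialized before each restart; the sketch only illustrates the shape of that logic.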