On the Convergence of the Variable Metric Method with Numerical Derivatives and the Effect of Noise in the Function Evaluation
The effect of numerical estimates of the gradient on Wolfe's convergence proof for descent algorithms is considered. It is shown that there is a value e₀ such that if the termination criterion is set greater than this value then convergence is assured, but that below this value the behaviour is uncertain. This theoretical result agrees with previously published experimental results. A modified scheme for the search along a line is described which both enables a better estimate of the initial slope to be made and allows convergence to a more accurate point. It is known that variable metric methods are very unreliable when the function evaluation is subject to noise. It is shown that this unreliability has two causes: the numerical estimates of the gradient become very unreliable, and the standard line search strategy fails. The new linear search overcomes the second of these difficulties.
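The interplay between the difference step and the noise level can be illustrated with a minimal sketch (not taken from the paper). For a forward-difference estimate of f'(x) with step h, truncation error grows like h while noise of magnitude ε contributes error up to 2ε/h, so the step cannot be made arbitrarily small; a step near √ε is the classical compromise. The function and noise model below are illustrative choices, not from the source.

```python
import math

def fd_gradient(f, x, h):
    """Forward-difference estimate of f'(x): (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

EPS = 1e-6  # assumed bound on the noise in each function evaluation

def noisy_f(x):
    # f(x) = x^2 plus a bounded, deterministic "noise" term of size EPS.
    return x * x + EPS * math.sin(x / EPS)

# True derivative of x^2 at x = 1 is 2.  With h ~ sqrt(EPS) the total
# error (truncation ~ h, noise ~ 2*EPS/h) stays small; with h far below
# sqrt(EPS) the noise term 2*EPS/h can dominate the estimate entirely.
estimate = fd_gradient(noisy_f, 1.0, 1e-3)
```

The same trade-off is why, as the abstract notes, numerical gradient estimates feeding a variable metric method become unreliable once the noise level is comparable to the differences being taken.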