How bad are the BFGS and DFP methods when the objective function is quadratic?
We study the use of the BFGS and DFP algorithms with step-lengths of one for minimizing quadratic functions of only two variables. The updating formulae in this case imply nonlinear three term recurrence relations between the eigenvalues of consecutive second derivative approximations, which are analysed in order to explain some gross inefficiencies that can occur. Specifically, the BFGS algorithm may require more than 10 iterations to achieve the first decimal place of accuracy, while the performance of the DFP method is far worse. The results help to explain why the DFP method is often less suitable than the BFGS algorithm for general unconstrained optimization calculations, and they show that quadratic functions provide much information about efficiency when the current vector of variables is too far from the solution for an asymptotic convergence analysis.
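To make the setting concrete, the following minimal sketch runs both quasi-Newton updates with unit step lengths on a two-variable quadratic and prints the objective values, so the slow decrease described above can be observed numerically. The particular quadratic, starting point, and initial Hessian approximation below are illustrative choices and are not taken from the paper.

```python
import numpy as np

def quasi_newton_unit_step(A, x0, B0, update, iters=30):
    """Minimize f(x) = 0.5 x^T A x with step lengths of one, starting from
    the Hessian approximation B0 and applying the named quasi-Newton update.
    Returns the sequence of objective values."""
    x, B = x0.astype(float), B0.astype(float)
    f_vals = [0.5 * x @ A @ x]
    for _ in range(iters):
        g = A @ x                       # gradient of the quadratic
        s = -np.linalg.solve(B, g)      # search direction, taken with step length one
        x = x + s
        y = A @ s                       # gradient difference g_{k+1} - g_k
        if update == "BFGS":
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
        else:                           # DFP update of the Hessian approximation
            rho = 1.0 / (y @ s)
            V = np.eye(len(x)) - rho * np.outer(y, s)
            B = V @ B @ V.T + rho * np.outer(y, y)
        f_vals.append(0.5 * x @ A @ x)
    return f_vals

# Illustrative ill-scaled two-variable quadratic and a crude initial
# Hessian approximation (assumed values, not from the paper).
A = np.diag([1.0, 100.0])
x0 = np.array([1.0, 1.0])
B0 = np.eye(2)

for name in ("BFGS", "DFP"):
    f = quasi_newton_unit_step(A, x0, B0, name)
    print(name, ["%.3e" % v for v in f[:12]])
```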