Variations on Variable-Metric Methods

In the unconstrained minimization of a function f, the method of Davidon-Fletcher-Powell (a "variable-metric" method) enables the inverse of the Hessian H of f to be approximated stepwise, using only values of the gradient of f. It is shown here that, by solving a certain variational problem, formulas for the successive corrections to H can be derived which closely resemble Davidon's. A symmetric correction matrix is sought which minimizes a weighted Euclidean norm and also satisfies the "DFP condition." Numerical tests are described, comparing the performance (on four "standard" test functions) of two variationally derived formulas with Davidon's. A proof by Y. Bard, modelled on Fletcher and Powell's, showing that the new formulas give the exact H after N steps, is included in an appendix.

1. The DFP Method. The class of gradient methods for finding the unconstrained minimum of a function f(x)* in which the direction s_k of the next iterative step from x_k to x_{k+1} is computed from a formula such as:
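The stepwise approximation described in the abstract can be sketched numerically. The following is a minimal illustration (not the paper's own code) of the classical DFP correction: the inverse-Hessian estimate H is updated from the step s_k = x_{k+1} - x_k and the gradient change y_k, so that the updated H satisfies the "DFP condition" H y_k = s_k, and, with exact line searches on a quadratic, becomes the exact inverse Hessian after N steps. The quadratic test function here is an assumption chosen for illustration.

```python
import numpy as np

def dfp_update(H, s, y):
    """One DFP correction to the inverse-Hessian approximation H.

    s = x_{k+1} - x_k (the step), y = grad_{k+1} - grad_k (gradient change).
    The returned matrix satisfies the DFP condition H_new @ y == s.
    """
    Hy = H @ y
    return H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x (our choice, not the paper's).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
H = np.eye(2)            # initial inverse-Hessian estimate
for _ in range(2):       # N = 2 steps suffice on a 2-D quadratic
    g = grad(x)
    d = -H @ g           # search direction s_k = -H_k g_k
    alpha = -(g @ d) / (d @ A @ d)   # exact line search for a quadratic
    s = alpha * d
    x_new = x + s
    y = grad(x_new) - g
    H = dfp_update(H, s, y)
    assert np.allclose(H @ y, s)     # the DFP condition holds after each update
    x = x_new

print(np.allclose(H, np.linalg.inv(A)))        # exact inverse Hessian after N steps
print(np.allclose(x, np.linalg.solve(A, b)))   # minimizer reached
```

This exhibits the two properties the abstract refers to: each correction preserves symmetry and enforces the DFP condition, and on a quadratic with exact line searches the exact H (here, A^{-1}) is recovered after N iterations.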