Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]

where $g^{(k)} = \nabla f(x^{(k)})$ is the gradient of $f$ at $x^{(k)}$, $H^{(k)}$ is a matrix which is designed to approximate the inverse Hessian matrix of $f$ at $x^{(k)}$, and $\alpha^{(k)}$ is an appropriately chosen scalar. The sequence of matrices $H^{(k)}$ is chosen to satisfy the quasi-Newton equation $H^{(k+1)} y^{(k)} = \sigma^{(k)}$, where $\sigma^{(k)} = x^{(k+1)} - x^{(k)}$ and $y^{(k)} = g^{(k+1)} - g^{(k)}$. In general, $H^{(k+1)}$ is generated by $H^{(k+1)} = H^{(k)} + D^{(k)}$, where $D^{(k)}$ is chosen to satisfy the equation $D^{(k)} y^{(k)} = \sigma^{(k)} - H^{(k)} y^{(k)}$. The choice of $\alpha^{(k)}$ can be accomplished either by a linear-search or a step-length method. Perhaps the most complete current reference to the general theory is Powell [8]. Two different methods for determining $\alpha^{(k)}$ are incorporated and are outlined below. The Broyden-Fletcher-Shanno (BFS) matrix update (developed
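As a concrete illustration of the iteration described above, the following is a minimal Python sketch (not the published Fortran of Algorithm 500). It uses the familiar BFGS-form inverse-Hessian update, which satisfies the quasi-Newton equation $H^{(k+1)} y^{(k)} = \sigma^{(k)}$, and a simple Armijo backtracking rule as a stand-in for the paper's two strategies for choosing $\alpha^{(k)}$; the routine name `quasi_newton`, the tolerances, and the toy quadratic test problem are all illustrative assumptions.

```python
import numpy as np

def quasi_newton(f, grad, x0, tol=1e-8, max_iter=200):
    """Schematic quasi-Newton iteration (not Algorithm 500's Fortran).
    H approximates the inverse Hessian and is updated so that
    H_new @ y == sigma, i.e. the quasi-Newton equation holds."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                 # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # quasi-Newton search direction
        # crude backtracking rule, standing in for the paper's
        # linear-search / step-length choices of alpha^(k)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        sigma = x_new - x              # sigma^(k) = x^(k+1) - x^(k)
        y = g_new - g                  # y^(k)     = g^(k+1) - g^(k)
        sy = sigma @ y
        if sy > 1e-12:                 # skip update if curvature is too small
            Hy = H @ y
            # BFGS-form rank-two update; by construction H_new @ y == sigma
            H = (H
                 + np.outer(sigma, sigma) * (1.0 + (y @ Hy) / sy) / sy
                 - (np.outer(Hy, sigma) + np.outer(sigma, Hy)) / sy)
        x, g = x_new, g_new
    return x

# illustrative usage on a small quadratic
A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
print(quasi_newton(f, grad, np.array([1.0, 1.0])))
```

The guard on $\sigma^{(k)\mathsf{T}} y^{(k)}$ simply skips the update when the curvature information is unreliable, one common way of keeping $H^{(k)}$ positive definite in such a sketch.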