Remark on “Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]”

The subroutine incorporates two nonlinear optimization methods, a conjugate gradient algorithm and a variable metric algorithm, with the choice of method left to the user. The conjugate gradient algorithm is the Beale restarted memoryless variable metric algorithm documented in Shanno [7]. This method requires approximately 7n double-precision words of working storage to be provided by the user. The variable metric method is the BFGS algorithm with initial scaling documented in Shanno and Phua [10], and requires approximately n²/2 + 11n/2 double-precision words of working storage. The same basic linear search technique is used for both methods, with two differences.
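As a quick illustration of the storage figures quoted above, the following is a minimal sketch; the helper name conmin_workspace and the method labels are assumptions for illustration only, not part of the published package:

```python
def conmin_workspace(n, method):
    """Approximate working storage, in double-precision words,
    for the two methods (figures as quoted in this remark)."""
    if method == "cg":      # Beale restarted memoryless variable metric
        return 7 * n
    if method == "bfgs":    # BFGS with initial scaling
        return (n * (n + 11)) // 2   # n**2/2 + 11n/2; n*(n+11) is always even
    raise ValueError("method must be 'cg' or 'bfgs'")
```

For n = 100, for example, this gives 700 words for the conjugate gradient method and 5550 words for the BFGS method.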
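The linear search, described next, is built around Davidon's cubic interpolation. A minimal sketch of one such interpolation step is given below; the published routine is Fortran, and this Python rendering, the function name cubic_step, and the bisection and clamping safeguards are assumptions for illustration, not the routine's actual logic. The step estimates the minimizer of the one-dimensional function φ(α) from its values and directional derivatives at two trial points.

```python
import math

def cubic_step(a, fa, ga, b, fb, gb):
    """One cubic-interpolation step: estimate the minimizer of phi
    between a and b, given values (fa, fb) and directional
    derivatives (ga, gb) of phi at the two endpoints."""
    d1 = ga + gb - 3.0 * (fa - fb) / (a - b)
    rad = d1 * d1 - ga * gb
    if rad < 0.0:                       # cubic has no real minimizer:
        return 0.5 * (a + b)            # fall back to bisection
    d2 = math.copysign(math.sqrt(rad), b - a)
    denom = gb - ga + 2.0 * d2
    if denom == 0.0:                    # degenerate endpoint data
        return 0.5 * (a + b)
    alpha = b - (b - a) * (gb + d2 - d1) / denom
    lo, hi = min(a, b), max(a, b)
    return min(max(alpha, lo), hi)      # keep the estimate inside [a, b]
```

The basic linear search uses Davidon's cubic interpolation to find a step length α_k which satisfies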