An Unconstrained Optimization Algorithm Which Uses Function and Gradient Values

A new method for unconstrained optimization is presented. It consists of a modification of Powell's 1970 dogleg strategy with the approximate Hessian given by Davidon's 1975 updating scheme, which uses the projections of $\Delta x$ and $\Delta g$ in updating $H$ and $G$ and optimizes the condition number of $H^{-1}H_{+}$. The new algorithm performs well without Powell's special iterations and singularity safeguards. Only symmetric, positive definite updates to the Hessian are used.
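
To make the two ingredients named above concrete, the following is a minimal sketch, not the paper's algorithm: a Powell-style dogleg step inside a trust region, paired with a symmetric positive definite quasi-Newton update of the Hessian approximation. A BFGS update stands in here purely for illustration, since Davidon's 1975 optimally conditioned formula (which chooses the update to optimize the condition number of $H^{-1}H_{+}$) is more involved; the function names and the NumPy dependency are assumptions of this sketch.

```python
# Illustrative sketch only; not the method of the paper.
import numpy as np

def dogleg_step(g, B, delta):
    """Powell-style dogleg step for min g.p + 0.5 p.B.p s.t. ||p|| <= delta.

    g     : gradient at the current iterate
    B     : symmetric positive definite Hessian approximation
    delta : trust-region radius
    """
    # Full quasi-Newton step; accept it if it lies inside the region.
    p_newton = -np.linalg.solve(B, g)
    if np.linalg.norm(p_newton) <= delta:
        return p_newton
    # Cauchy step: minimizer of the quadratic model along -g.
    p_cauchy = -(g @ g) / (g @ B @ g) * g
    if np.linalg.norm(p_cauchy) >= delta:
        # Even the Cauchy step leaves the region: truncate along -g.
        return -(delta / np.linalg.norm(g)) * g
    # Otherwise walk the dogleg path from p_cauchy toward p_newton and
    # stop at the boundary: solve ||p_cauchy + t d|| = delta for t in [0,1].
    d = p_newton - p_cauchy
    a = d @ d
    b = 2.0 * (p_cauchy @ d)
    c = p_cauchy @ p_cauchy - delta**2
    t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + t * d

def spd_update(B, dx, dg):
    """Symmetric rank-two (BFGS) update of B; preserves positive
    definiteness whenever dx.dg > 0. Stands in for Davidon's
    optimally conditioned update, which this sketch does not implement."""
    Bdx = B @ dx
    return (B
            - np.outer(Bdx, Bdx) / (dx @ Bdx)
            + np.outer(dg, dg) / (dx @ dg))

if __name__ == "__main__":
    # Tiny demonstration on a quadratic model.
    B = np.eye(2)
    g = np.array([1.0, 2.0])
    p = dogleg_step(g, B, delta=0.5)
    print(p, np.linalg.norm(p))  # step of length <= 0.5
```

In a full method of this kind, the trust-region radius $\Delta$ is expanded or shrunk by comparing actual to predicted reduction, and the update is applied with $\Delta x$ and $\Delta g$ taken as the step and the gradient difference; those surrounding details are omitted above.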