New Quasi-Newton Equation and Related Methods for Unconstrained Optimization

Abstract. In unconstrained optimization, the usual quasi-Newton equation is $B_{k+1} s_k = y_k$, where $y_k$ is the difference of the gradients at the last two iterates. In this paper, we propose a new quasi-Newton equation, $B_{k+1} s_k = \tilde y_k$, in which $\tilde y_k$ is based on both the function values and the gradients at the last two iterates. The new equation is superior to the old one in the sense that $\tilde y_k$ approximates $\nabla^2 f(x_{k+1}) s_k$ better than $y_k$ does. Modified quasi-Newton methods based on the new quasi-Newton equation are locally and superlinearly convergent. Extensive numerical experiments have been conducted, showing that the new quasi-Newton methods perform encouragingly.
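The abstract's central claim, that a vector built from function values as well as gradients approximates $\nabla^2 f(x_{k+1}) s_k$ better than $y_k$, follows from a standard Taylor-expansion argument. The sketch below uses a correction term of the form $\theta_k = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s_k$; this is the usual function-value-based modification in this line of work, but the exact form of $\tilde y_k$ used in the paper may differ in detail.

```latex
% Notation: s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, f_i = f(x_i), g_i = \nabla f(x_i).
% Expanding f and g around x_{k+1} along the step -s_k, with
% D_3 := \nabla^3 f(x_{k+1})[s_k, s_k, s_k]:
\begin{align*}
  f_k &= f_{k+1} - g_{k+1}^T s_k + \tfrac12 s_k^T \nabla^2 f(x_{k+1}) s_k
         - \tfrac16 D_3 + O(\|s_k\|^4),\\
  s_k^T y_k &= s_k^T \nabla^2 f(x_{k+1}) s_k - \tfrac12 D_3 + O(\|s_k\|^4).
\end{align*}
% So the classical secant condition matches the curvature only to third order.
% Combining the two expansions, the function-value correction
%   \theta_k = 6(f_k - f_{k+1}) + 3(g_k + g_{k+1})^T s_k
% satisfies \theta_k = \tfrac12 D_3 + O(\|s_k\|^4), so that
\begin{align*}
  s_k^T \tilde y_k := s_k^T y_k + \theta_k
      = s_k^T \nabla^2 f(x_{k+1}) s_k + O(\|s_k\|^4),
\end{align*}
% i.e. the modified condition cancels the leading third-order error term
% that the classical condition s_k^T y_k leaves behind.
```

In words: $y_k$ captures the curvature along $s_k$ with an $O(\|s_k\|^3)$ error, while the function-value correction cancels the third-derivative term, leaving an $O(\|s_k\|^4)$ error, which is the sense in which $\tilde y_k$ better approximates $\nabla^2 f(x_{k+1}) s_k$.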