A curvilinear optimization method based upon iterative estimation of the eigensystem of the Hessian matrix

Abstract An algorithm was recently presented that minimizes a nonlinear function in several variables using a Newton-type curvilinear search path. To determine this curvilinear search path, the eigenvalue problem of the Hessian matrix of the objective function must be solved at each iteration of the algorithm. In this paper an iterative procedure requiring only gradient information is developed to approximate the eigensystem of the Hessian matrix. It is shown that for a quadratic function the approximate eigenvalues and eigenvectors converge rapidly to the actual eigenvalues and eigenvectors of its Hessian matrix. Numerical tests indicate that the resulting algorithm is fast and stable. Moreover, the availability of approximations to the eigenvectors of the Hessian matrix is exploited to escape saddle points and to accelerate convergence on flat functions.
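The abstract's central idea, estimating the eigensystem of the Hessian from gradient information only, can be illustrated with a minimal sketch. The code below is not the paper's procedure; it substitutes a standard alternative (subspace iteration driven by finite-difference Hessian-vector products, which need only gradient evaluations) and checks the abstract's claim on a quadratic function, where the Hessian is constant and the recovered eigenvalues should match it.

```python
import numpy as np

def hess_vec(grad, x, v, eps=1e-6):
    # Hessian-vector product from gradients only (forward difference):
    # H(x) v ≈ (∇f(x + eps v) − ∇f(x)) / eps
    return (grad(x + eps * v) - grad(x)) / eps

def estimate_eigensystem(grad, x, n, k, iters=50, seed=0):
    # Subspace (orthogonal) iteration on the Hessian at x, using only
    # gradient evaluations; approximates k eigenpairs of largest magnitude.
    rng = np.random.default_rng(seed)
    V = np.linalg.qr(rng.standard_normal((n, k)))[0]
    for _ in range(iters):
        W = np.column_stack([hess_vec(grad, x, V[:, j]) for j in range(k)])
        V, _ = np.linalg.qr(W)
    # Rayleigh-quotient projection gives the eigenvalue estimates.
    HV = np.column_stack([hess_vec(grad, x, V[:, j]) for j in range(k)])
    T = V.T @ HV
    lam, S = np.linalg.eigh((T + T.T) / 2)
    return lam, V @ S

# Quadratic test case: f(x) = 0.5 xᵀAx, so the Hessian is exactly A.
n = 6
rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
grad = lambda x: A @ x

lam, V = estimate_eigensystem(grad, np.zeros(n), n, n)
print(np.allclose(np.sort(lam), np.sort(np.linalg.eigvalsh(A)), atol=1e-4))
```

Because the gradient of a quadratic is linear, the finite-difference Hessian-vector product is exact up to rounding, so the estimated spectrum agrees with `np.linalg.eigvalsh(A)` to tight tolerance, consistent with the rapid convergence claimed for the quadratic case.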