Abstract

An algorithm was recently presented that minimizes a nonlinear function of several variables using a Newton-type curvilinear search path. Determining this search path requires solving the eigenvalue problem of the Hessian matrix of the objective function at each iteration of the algorithm. In this paper, an iterative procedure requiring only gradient information is developed to approximate the eigensystem of the Hessian matrix. It is shown that for a quadratic function the approximate eigenvalues and eigenvectors converge rapidly to the true eigenvalues and eigenvectors of its Hessian matrix. Numerical tests indicate that the resulting algorithm is fast and stable. Moreover, the availability of approximations to the eigenvectors of the Hessian matrix is exploited to move past saddle points and to accelerate convergence on flat functions.
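The abstract does not reproduce the paper's update formulas, but the core idea, estimating the eigensystem of the Hessian from gradient evaluations alone, can be sketched with a standard substitute: finite-difference Hessian-vector products combined with subspace (orthogonal) iteration and a Rayleigh-Ritz projection. The helper names, the forward-difference step `eps`, and the subspace dimension `k` below are illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def hessian_vector_product(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v using gradient information only:
    H v ~ (g(x + eps*v) - g(x)) / eps (forward difference)."""
    return (grad(x + eps * v) - grad(x)) / eps

def approximate_eigensystem(grad, x, k, iters=50, eps=1e-6, seed=0):
    """Estimate the k dominant eigenpairs of the Hessian at x by
    subspace (orthogonal) iteration driven by gradient differences."""
    n = x.size
    rng = np.random.default_rng(seed)
    V = np.linalg.qr(rng.standard_normal((n, k)))[0]  # orthonormal start
    for _ in range(iters):
        HV = np.column_stack([hessian_vector_product(grad, x, V[:, j], eps)
                              for j in range(k)])
        V, _ = np.linalg.qr(HV)  # re-orthonormalize the iterated subspace
    # Rayleigh-Ritz step: eigenpairs of the projected k x k matrix
    HV = np.column_stack([hessian_vector_product(grad, x, V[:, j], eps)
                          for j in range(k)])
    T = V.T @ HV
    evals, W = np.linalg.eigh((T + T.T) / 2)  # symmetrize against FD noise
    return evals, V @ W

# Example: quadratic f(x) = 0.5 x^T A x has Hessian exactly A,
# so the estimates should match A's eigenpairs closely.
A = np.diag([1.0, 3.0, 10.0])
grad = lambda x: A @ x
evals, evecs = approximate_eigensystem(grad, np.ones(3), k=3)
print(evals)  # approximately [1, 3, 10]
```

For a quadratic function the gradient difference g(x + eps*v) - g(x) equals eps*A*v exactly (up to rounding), which mirrors the abstract's claim that the approximate eigenpairs converge rapidly to those of the Hessian in the quadratic case.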