Self-scaling variable metric algorithms without line search for unconstrained minimization

This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier, with line search, in self-scaling variable metric algorithms. It is proved that, in the quadratic case, the new algorithms converge at least weakly superlinearly. A special case of the above algorithms was implemented and tested numerically on several test functions. In this implementation, however, cubic interpolation was performed whenever the objective function was not satisfactorily decreased on the first "shot" (with unit step size), but this did not occur often, except for very difficult functions. The numerical results indicate that the new algorithm is competitive with, and often superior to, previous methods.

1. Introduction. This paper addresses the problem of minimizing a smooth real-valued function f(x) of an n-dimensional vector x, assuming the availability of the gradient $\nabla f(x) = g(x)$ for any given x. An important class of algorithms for solving this problem is the quasi-Newton methods, also known as variable metric algorithms. In these methods, the successive points are obtained by the equation

(1)   $x_{k+1} = x_k - \alpha_k D_k g_k$,
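To make the iteration in (1) concrete, the following is a minimal sketch of a self-scaling variable metric step in Python. It assumes one member of the two-parameter update family (selected by a parameter called theta here), uses the common self-scaling factor gamma = s'y / y'Dy, takes a unit step by default, and substitutes a simple backtracking loop for the cubic-interpolation fallback described in the abstract; the function name and its arguments are illustrative, not the paper's notation.

```python
import numpy as np

def ssvm_minimize(f, grad, x0, theta=1.0, tol=1e-8, max_iter=200):
    """Sketch of a self-scaling variable metric iteration (illustrative only).

    theta selects a member of the rank-two update family; gamma = s'y / y'Dy
    is one common self-scaling choice.  The unit step is tried first, with a
    crude backtracking fallback standing in for cubic interpolation.
    """
    x = np.asarray(x0, dtype=float)
    D = np.eye(x.size)                      # positive definite inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -D @ g                          # quasi-Newton direction, as in equation (1)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx and alpha > 1e-12:
            alpha *= 0.5                    # fallback when the unit step does not decrease f
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        Dy = D @ y
        sy, yDy = s @ y, y @ Dy
        if sy > 1e-12 and yDy > 1e-12:      # skip the update if positive definiteness is at risk
            gamma = sy / yDy                # self-scaling factor
            v = s / sy - Dy / yDy
            D = gamma * (D - np.outer(Dy, Dy) / yDy
                         + theta * yDy * np.outer(v, v)) + np.outer(s, s) / sy
        x, g = x_new, g_new
    return x

# Example use on the Rosenbrock function (a standard test problem):
rosen = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
rosen_grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                                 200.0 * (x[1] - x[0]**2)])
x_star = ssvm_minimize(rosen, rosen_grad, np.array([-1.2, 1.0]))
```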
