A new descent algorithm with curve search rule

In this paper we present a new descent algorithm with a curve search rule for unconstrained minimization problems. At each iteration, the next iterate is determined by means of a curve search rule; a distinctive feature is that the search direction and the step-size are determined simultaneously. Like conjugate gradient methods, the algorithm avoids the computation and storage of matrices associated with the Hessian of the objective function. Unlike ODE methods, it does not require solving ordinary differential equations at each iteration. Although its convergence rate is not as fast as that of Newton-like methods, the algorithm is well suited to large-scale minimization problems.
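The paper does not reproduce its curve search rule here, but the idea of moving along a curve whose shape couples the direction with the step-size can be illustrated by the following sketch. It is only an assumption-laden toy: the trial curve x(t) = x_k - t g_k + t² d_{k-1} (a gradient term bent by the previous displacement) and the Armijo-type acceptance test are illustrative choices, not the paper's actual rule. Note that only gradients are evaluated, with no Hessian or matrix storage and no ODE solves.

```python
import numpy as np

def curve_search_descent(f, grad, x0, beta=0.5, sigma=1e-4,
                         max_iter=500, tol=1e-8):
    """Illustrative curvilinear descent (not the paper's exact rule).

    Each trial point moves along the curve
        x(t) = x_k - t * g_k + t**2 * d_prev,
    so the effective direction and the step-size t are chosen
    together by one backtracking loop.  Only gradient evaluations
    are used; no Hessian (or approximation thereof) is stored.
    """
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)            # previous displacement x_k - x_{k-1}
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t, fx = 1.0, f(x)
        while True:
            x_trial = x - t * g + t**2 * d_prev
            # Armijo-type sufficient decrease; at t -> 0 the curve's
            # tangent is -g, so -sigma * t * g.g is the usual bound.
            if f(x_trial) <= fx - sigma * t * g.dot(g):
                break
            t *= beta
            if t < 1e-16:                # safeguard: tiny gradient step
                x_trial = x - 1e-16 * g
                break
        d_prev = x_trial - x
        x = x_trial
    return x
```

For example, on the quadratic f(x) = ½(x₁² + 10 x₂²) the iterates converge to the origin using gradient information only; the t² d_prev term lets accumulated progress bend the search path, in the spirit of memory gradient methods.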
