A new three-term conjugate gradient algorithm for unconstrained optimization

A new three-term conjugate gradient algorithm which satisfies both the descent condition and the conjugacy condition is presented. The algorithm is obtained by minimizing a one-parameter quadratic model of the objective function in which a symmetric approximation of the Hessian matrix satisfies the general quasi-Newton equation. The search direction is obtained by symmetrizing the iteration matrix corresponding to the solution of the quadratic model minimization. Through the general quasi-Newton equation, the search direction includes a parameter, which is determined by minimizing the condition number of the iteration matrix. It is proved that this direction satisfies both the conjugacy and the descent conditions. The new approximation to the minimum is obtained by a general Wolfe line search, combined with a by-now standard acceleration technique. Under standard assumptions, the global convergence of the algorithm is proved for uniformly convex functions. Numerical experiments on 800 large-scale unconstrained optimization test problems show that minimizing the condition number of the iteration matrix leads to a value of the parameter in the search direction that defines a competitive three-term conjugate gradient algorithm. Numerical comparisons of this variant against the known conjugate gradient algorithms ASCALCG, CONMIN, TTCG and THREECG, as well as the limited-memory quasi-Newton algorithm LBFGS (m = 5) and the truncated Newton algorithm TN, show that our algorithm is more efficient and more robust.
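
The abstract does not reproduce the paper's formulas, but the general shape of the three-term conjugate gradient methods it describes is d_{k+1} = -g_{k+1} + beta_k d_k + eta_k y_k, with g_k the gradient at x_k, y_k = g_{k+1} - g_k, and the step length alpha_k produced by a Wolfe line search. The sketch below is a minimal illustration of such a scheme, not the paper's algorithm: in place of the condition-number-minimizing parameter it uses the classical Zhang–Zhou–Li descent PRP-type coefficients, and the function name three_term_cg and the Rosenbrock test setup are purely illustrative.

```python
import numpy as np
from scipy.optimize import line_search

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic three-term CG loop with a Wolfe line search (illustrative)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Standard Wolfe line search (c2 = 0.1 is customary for CG methods).
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
        if alpha is None:                # line search failed: restart along -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                    # gradient difference y_k
        gg = g @ g
        # Zhang-Zhou-Li three-term PRP coefficients; they guarantee
        # g_{k+1}^T d_{k+1} = -||g_{k+1}||^2, i.e. sufficient descent,
        # independently of the line search.
        beta = (g_new @ y) / gg
        theta = (g_new @ d) / gg
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Usage example on the extended Rosenbrock function (n even).
    n = 100
    def f(x):
        return np.sum(100.0*(x[1::2] - x[::2]**2)**2 + (1.0 - x[::2])**2)
    def grad(x):
        g = np.zeros_like(x)
        g[1::2] = 200.0*(x[1::2] - x[::2]**2)
        g[::2] = -400.0*x[::2]*(x[1::2] - x[::2]**2) - 2.0*(1.0 - x[::2])
        return g
    x_star = three_term_cg(f, grad, np.tile([-1.2, 1.0], n // 2))
    print(f(x_star))                     # ~0 at the minimizer x = (1, ..., 1)
```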
