A new class of spectral conjugate gradient methods based on a modified secant equation for unconstrained optimization

Conjugate gradient methods have played a special role in solving large-scale optimization problems because of the simplicity of their iterations, their convergence properties, and their low memory requirements. In this work, we propose a new class of spectral conjugate gradient methods that ensure sufficient descent independently of the accuracy of the line search. Moreover, an attractive property of the proposed methods is that they achieve high-order accuracy in approximating the second-order curvature information of the objective function by utilizing the modified secant condition proposed by Babaie-Kafaki et al. [S. Babaie-Kafaki, R. Ghanbari, N. Mahdavi-Amiri, Two new conjugate gradient methods based on modified secant equations, Journal of Computational and Applied Mathematics 234 (2010) 1374-1386]. Further, a global convergence result for general functions is established, provided that the line search satisfies the Wolfe conditions. Our numerical experiments indicate that the proposed methods are preferable to, and in general more efficient and robust than, the classical conjugate gradient methods.
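
To make the structure of a spectral conjugate gradient iteration concrete, the sketch below shows a generic implementation in Python with a Wolfe line search. It is not the method proposed in this paper: the spectral parameter theta_k and the conjugate gradient parameter beta_k used here (a Barzilai-Borwein step and a PRP+ parameter) are placeholder choices, and the SciPy helpers are assumptions of the example rather than part of the paper.

```python
import numpy as np
from scipy.optimize import line_search  # Wolfe line search


def spectral_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic spectral CG sketch: d_k = -theta_k * g_k + beta_k * d_{k-1}.

    theta_k and beta_k below are generic placeholders (Barzilai-Borwein step
    and PRP+ parameter), not the choices analyzed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]  # step satisfying Wolfe conditions
        if alpha is None:                             # fall back if the search fails
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        theta = float(s @ s) / sy if sy > 1e-12 else 1.0  # spectral (BB) parameter
        beta = max(0.0, float(g_new @ y) / float(g @ g))  # PRP+ parameter
        d = -theta * g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Quick check on the Rosenbrock test function
    from scipy.optimize import rosen, rosen_der
    print(spectral_cg(rosen, rosen_der, np.zeros(10)))
```

With these placeholder parameters the search direction is not guaranteed to be a descent direction; the point of the proposed methods is precisely to choose theta_k and beta_k so that sufficient descent holds independently of the line search accuracy, with beta_k built from the modified secant condition.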

[1] Wenyu Sun, et al., Global convergence of nonmonotone descent methods for unconstrained optimization problems, 2002.

[2] E. Polak, et al., Note sur la convergence de méthodes de directions conjuguées, 1969.

[3] Ya-Xiang Yuan, et al., Convergence Properties of Nonlinear Conjugate Gradient Methods, 1999, SIAM J. Optim.

[4] M. J. D. Powell, et al., Restart procedures for the conjugate gradient method, 1977, Math. Program.

[5] Reza Ghanbari, et al., Two new conjugate gradient methods based on modified secant equations, 2010, J. Comput. Appl. Math.

[6] Jinhua Guo, et al., A new family of conjugate gradient methods, 2009.

[7] Zhen-Jun Shi, et al., Convergence of nonmonotone line search method, 2006.

[8] J. M. Martínez, et al., A Spectral Conjugate Gradient Method for Unconstrained Optimization, 2001.

[9] William W. Hager, et al., A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search, 2005, SIAM J. Optim.

[10] Luigi Grippo, et al., A globally convergent version of the Polak-Ribière conjugate gradient method, 1997, Math. Program.

[11] Hiroshi Yabe, et al., Multi-step nonlinear conjugate gradient methods for unconstrained minimization, 2008, Comput. Optim. Appl.

[13] W. Hager, et al., A survey of nonlinear conjugate gradient methods, 2005.

[14] Dong Wang, et al., Notes on the Dai-Yuan-Yuan modified spectral gradient method, 2010, J. Comput. Appl. Math.

[15] Gonglin Yuan, et al., Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, 2009, Optim. Lett.

[16] Marcos Raydan, et al., The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem, 1997, SIAM J. Optim.

[17] Yuhong Dai, Nonlinear Conjugate Gradient Methods, 2011.

[18] Jorge J. Moré, et al., Benchmarking optimization software with performance profiles, 2001, Math. Program. (DOI 10.1007/s101070100263).

[19] Nezam Mahdavi-Amiri, et al., Two effective hybrid conjugate gradient algorithms based on modified BFGS updates, 2011, Numerical Algorithms.

[20] Issam A. R. Moghrabi, et al., Multi-step quasi-Newton methods for optimization, 1994.

[21] Wufan Chen, et al., Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization, 2008, Optim. Methods Softw.

[23] Guoyin Li, et al., New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, 2007.

[24] Avinoam Perry, et al., Technical Note - A Modified Conjugate Gradient Algorithm, 1978, Oper. Res.

[25] Weijun Zhou, et al., A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence, 2006.

[26] Gonglin Yuan, et al., The Superlinear Convergence of a Modified BFGS-Type Method for Unconstrained Optimization, 2004, Comput. Optim. Appl.

[27] Jorge Nocedal, et al., Global Convergence Properties of Conjugate Gradient Methods for Optimization, 1992, SIAM J. Optim.

[28] Issam A. R. Moghrabi, et al., Using function-values in multi-step quasi-Newton methods, 1996.

[29] Jianzhong Zhang, et al., Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, 2001.

[30] Li Zhang, et al., Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search, 2006, Numerische Mathematik.

[31] A. Perry, A Modified Conjugate Gradient Algorithm for Unconstrained Nonlinear Optimization, 1975.

[32] J. Borwein, et al., Two-Point Step Size Gradient Methods, 1988.

[33] M. Hestenes, et al., Methods of conjugate gradients for solving linear systems, 1952.

[34] C. M. Reeves, et al., Function minimization by conjugate gradients, 1964, Comput. J.

[35] Stephen J. Wright, et al., Numerical Optimization, Springer.

[36] Li Zhang, et al., Some descent three-term conjugate gradient methods and their global convergence, 2007, Optim. Methods Softw.

[37] Jingfeng Zhang, et al., New Quasi-Newton Equation and Related Methods for Unconstrained Optimization, 1999.