An efficient modified Polak–Ribière–Polyak conjugate gradient method with global convergence properties

The conjugate gradient (CG) method is one of the most popular methods for solving large-scale unconstrained optimization problems. In this paper, a new modification of the CG formula introduced by Polak, Ribière, and Polyak is proposed for objective functions that are bounded below and have a Lipschitz-continuous gradient. The new parameter guarantees global convergence when either the strong Wolfe-Powell (SWP) line search or the weak Wolfe-Powell (WWP) line search is employed, and a sufficient descent condition is proved under the SWP line search. Numerical comparisons between the proposed parameter and other recent CG modifications are carried out on a set of standard unconstrained optimization problems; the results demonstrate the efficiency of the proposed parameter relative to the other CG parameters.
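For reference, a minimal sketch of the classical quantities the abstract refers to (the specific modified parameter proposed in the paper is not written out in this abstract): the CG iteration, the original Polak–Ribière–Polyak parameter, the Wolfe–Powell line-search conditions, and the sufficient descent condition, with \( \delta \), \( \sigma \), and \( c \) denoting generic constants.

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
\[
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top}\,(g_{k+1} - g_k)}{\lVert g_k \rVert^{2}} .
\]
The strong Wolfe–Powell (SWP) line search requires
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k\, g_k^{\top} d_k, \qquad
\bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| \le \sigma\, \bigl| g_k^{\top} d_k \bigr|,
\qquad 0 < \delta < \sigma < 1,
\]
while the weak Wolfe–Powell (WWP) search replaces the second inequality by
\( g(x_k + \alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k \).
The sufficient descent condition asks for a constant \( c > 0 \) such that
\( g_k^{\top} d_k \le -c\, \lVert g_k \rVert^{2} \) for all \( k \).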
