A descent hybrid conjugate gradient method based on the memoryless BFGS update

In this work, we present a new hybrid conjugate gradient method based on the convex hybridization of the Dai–Yuan (DY) and nonnegative Hestenes–Stiefel (HS+) conjugate gradient update parameters, adopting a quasi-Newton philosophy. The hybridization parameter is computed by minimizing the distance between the hybrid conjugate gradient direction and the self-scaling memoryless BFGS direction. A significant property of the proposed method is that it ensures sufficient descent independently of the accuracy of the line search. Global convergence is established provided that the line search satisfies the Wolfe conditions. Our numerical experiments on a set of unconstrained optimization test problems from the CUTEr collection indicate that the proposed method is, in general, superior to classical conjugate gradient methods in terms of efficiency and robustness.
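To make the hybridization idea concrete, the following is a minimal sketch, not the authors' exact derivation. It assumes the standard DY and HS+ update formulas, a Shanno-type self-scaling memoryless BFGS direction with the Oren–Luenberger scaling tau = s'y / y'y, and an illustrative choice of the hybridization parameter theta by a one-dimensional least-squares fit (clipped to [0, 1] to keep the combination convex); the paper's own closed-form choice of theta may differ. All function and variable names here are illustrative.

```python
import numpy as np

def hybrid_direction(g, g_prev, d_prev, s):
    """One hybrid DY/HS+ step direction (illustrative sketch).

    g, g_prev : current and previous gradients
    d_prev    : previous search direction
    s         : previous step, x_k - x_{k-1}
    """
    y = g - g_prev                    # gradient difference y_{k-1}
    dy = d_prev @ y                   # d_{k-1}^T y_{k-1}, positive under the Wolfe conditions

    beta_dy = (g @ g) / dy            # Dai-Yuan parameter
    beta_hs = max((g @ y) / dy, 0.0)  # nonnegative Hestenes-Stiefel (HS+) parameter

    # Self-scaling memoryless BFGS direction d_ml = -H g, where H is the
    # memoryless BFGS update of tau*I with the pair (s, y).
    sy = s @ y
    tau = sy / (y @ y)                # Oren-Luenberger scaling (assumed here)
    Hg = (tau * (g - (g @ s) / sy * y - (g @ y) / sy * s)
          + (tau * (y @ y) / sy + 1.0) * (g @ s) / sy * s)
    d_ml = -Hg

    # d_hyb(theta) = -g + [beta_hs + theta*(beta_dy - beta_hs)] * d_prev is
    # affine in theta, so min_theta ||d_hyb(theta) - d_ml||^2 is a 1-D
    # least-squares problem with a closed-form solution.
    a = (beta_dy - beta_hs) * d_prev
    b = d_ml - (-g + beta_hs * d_prev)
    theta = float(np.clip((a @ b) / (a @ a), 0.0, 1.0))  # clip: convex combination

    beta = (1.0 - theta) * beta_hs + theta * beta_dy
    return -g + beta * d_prev
```

Because the hybrid direction is affine in theta, the distance-minimization reduces to projecting the memoryless BFGS direction onto a line segment of candidate directions, so the extra cost per iteration is a handful of inner products.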
