Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization

A scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique within the framework of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale–Powell restart criterion holds. The parameter scaling the gradient is selected as the spectral gradient. Under very mild conditions, it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results for a set consisting of 750 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods, including the spectral conjugate gradient method of Birgin and Martínez [Birgin, E. and Martínez, J.M., 2001, A spectral conjugate gradient method for unconstrained optimization. Applied Mathematics and Optimization, 43, 117–128], the conjugate gradient method of Polak and Ribière [Polak, E. and Ribière, G., 1969, Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle, 16, 35–43], as well as the most recent conjugate gradient method with guaranteed descent of Hager and Zhang [Hager, W.W. and Zhang, H., 2005, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization, 16, 170–192; Hager, W.W. and Zhang, H., 2004, CG-DESCENT, a conjugate gradient method with guaranteed descent. ACM Transactions on Mathematical Software, 32, 113–137].
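The two ingredients named above can be sketched in a few lines: the spectral (Barzilai–Borwein-type) scaling parameter, and a Perry/Shanno-type scaled memoryless BFGS search direction built from it. This is a minimal illustrative sketch, not the paper's implementation; the function names are ours, and the restart/preconditioning logic of the full algorithm is omitted.

```python
import numpy as np

def spectral_scaling(s, y):
    """Spectral scaling parameter: theta = s^T s / (s^T y).

    Here s = x_{k+1} - x_k and y = g_{k+1} - g_k. The value is positive
    whenever s^T y > 0, which a Wolfe line search guarantees for
    strongly convex functions.
    """
    return float(s @ s) / float(s @ y)

def scaled_memoryless_bfgs_direction(g, s, y, theta):
    """Search direction d = -H g, where H is the memoryless BFGS update
    of the scaled identity theta*I (Perry/Shanno-type formula).

    By construction H satisfies the secant equation H y = s.
    """
    ys = float(y @ s)
    gs = float(g @ s)
    gy = float(g @ y)
    yy = float(y @ y)
    return (-theta * g
            + theta * (gs / ys) * y
            - ((1.0 + theta * yy / ys) * (gs / ys)
               - theta * gy / ys) * s)
```

A convenient sanity check: since the underlying matrix satisfies the secant equation, substituting g = y into the direction formula returns exactly -s.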

[1] Nicholas I. M. Gould, et al. CUTE: constrained and unconstrained testing environment, 1995, TOMS.

[2] P. Wolfe. Convergence Conditions for Ascent Methods. II, 1969.

[3] M. Hestenes, et al. Methods of conjugate gradients for solving linear systems, 1952.

[4] Duan Li, et al. On Restart Procedures for the Conjugate Gradient Method, 2004, Numerical Algorithms.

[5] Ya-Xiang Yuan, et al. Convergence properties of the Beale–Powell restart algorithm, 1998.

[6] David F. Shanno, et al. Conjugate Gradient Methods with Inexact Searches, 1978, Math. Oper. Res.

[7] Marcos Raydan. The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem, 1997, SIAM J. Optim.

[8] M. J. D. Powell. Some convergence properties of the conjugate gradient method, 1976, Math. Program.

[9] L. Liao, et al. New Conjugacy Conditions and Related Nonlinear Conjugate Gradient Methods, 2001.

[10] D. Luenberger, et al. Self-Scaling Variable Metric (SSVM) Algorithms, 1974.

[11] E. Polak, et al. Note sur la convergence de méthodes de directions conjuguées, 1969.

[12] J. M. Martínez, et al. A Spectral Conjugate Gradient Method for Unconstrained Optimization, 2001.

[13] David F. Shanno, et al. Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4], 1976, TOMS.

[14] A. Perry. A Class of Conjugate Gradient Algorithms with a Two-Step Variable Metric Memory, 1977.

[15] P. Wolfe. Convergence Conditions for Ascent Methods. II: Some Corrections, 1971.

[16] William W. Hager, et al. Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, 2006, TOMS.

[17] William W. Hager, et al. A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search, 2005, SIAM J. Optim.

[18] Shmuel S. Oren, et al. Optimal conditioning of self-scaling variable metric algorithms, 1976, Math. Program.

[19] C. M. Reeves, et al. Function minimization by conjugate gradients, 1964, Comput. J.

[20] D. Shanno. On the Convergence of a New Conjugate Gradient Algorithm, 1978.
