A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization

The memory-less SR1 and memory-less BFGS methods are presented, together with their numerical performance on a set of 800 unconstrained optimization problems with the number of variables in the range [1000, 10,000]. By memory-less quasi-Newton methods we mean quasi-Newton methods in which the approximation to the inverse Hessian is reset to the identity matrix at every iteration. In these algorithms the stepsize is computed by the Wolfe line-search conditions. The convergence of the memory-less SR1 method is proved under the classical assumptions. A comparison between the memory-less SR1 and memory-less BFGS methods shows that memory-less BFGS is more efficient and more robust than memory-less SR1. A comparison between memory-less SR1 and the BFGS method from CONMIN, in the implementation of Shanno and Phua, shows that memory-less SR1 is more efficient and more robust than BFGS from CONMIN, one of the best implementations of BFGS. Additionally, a comparison of memory-less SR1 and memory-less BFGS against steepest descent shows that both memory-less algorithms are more efficient and more robust. The performance of these algorithms on five applications from the MINPACK-2 collection, each with 40,000 variables, is also reported; on these applications, memory-less BFGS is more efficient than memory-less SR1. It appears that the accuracy of the Hessian approximation along the iterations of a quasi-Newton method is not as crucial as is commonly believed.
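As a concrete illustration (a minimal sketch, not the paper's code), the step below forms the search direction d_{k+1} = -H_{k+1} g_{k+1}, where H_{k+1} is the identity matrix updated once by the SR1 or BFGS formula using s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, and the stepsize satisfies the Wolfe conditions via SciPy's line search. The test function (`rosen`), the safeguard thresholds, and the restart logic are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of memory-less SR1 / memory-less BFGS (assumptions noted above).
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der


def memoryless_direction(g, s, y, method="bfgs"):
    """Return d = -H g, where H is the identity updated once by SR1 or BFGS
    from the most recent pair (s, y); no matrix is ever stored."""
    ys = y @ s
    if method == "sr1":
        u = s - y                       # H = I + u u^T / (u^T y)
        uy = u @ y
        # standard SR1 safeguard: skip the update when u^T y is too small
        if abs(uy) < 1e-8 * np.linalg.norm(u) * np.linalg.norm(y):
            return -g
        return -g - ((u @ g) / uy) * u
    # memory-less BFGS: H = I - (s y^T + y s^T)/ys + (1 + y.y/ys) s s^T / ys
    if ys <= 1e-12:                     # curvature condition failed
        return -g
    return (-g + ((y @ g) * s + (s @ g) * y) / ys
            - (1.0 + (y @ y) / ys) * ((s @ g) / ys) * s)


def minimize(x, f=rosen, grad=rosen_der, method="bfgs", tol=1e-6, maxit=500):
    g = grad(x)
    d = -g                              # first iteration: steepest descent
    for _ in range(maxit):
        if np.linalg.norm(g, np.inf) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]   # Wolfe conditions
        if alpha is None:               # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                alpha = 1e-4            # last-resort small step
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        d = memoryless_direction(g_new, s, g_new - g, method)
        x, g = x_new, g_new
    return x


x_star = minimize(np.full(1000, -1.2), method="sr1")
print("final |grad|_inf =", np.linalg.norm(rosen_der(x_star), np.inf))
```

Because only the vectors s and y from the current iteration are kept, each direction costs a handful of inner products, which is what makes these methods attractive for large-scale problems.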
