Algebraic rules for quadratic regularization of Newton’s method

In this work we propose a class of quasi-Newton methods for minimizing a twice differentiable function with Lipschitz continuous Hessian. These methods are based on quadratic regularization of Newton's method, with explicit algebraic rules for computing the regularization parameter. We analyse the convergence properties of this class of methods and show that if the sequence generated by the algorithm converges, then its limit point is stationary. We also establish local quadratic convergence in a neighborhood of a stationary point with positive definite Hessian. Encouraging numerical experiments are presented.
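To make the step computation concrete, below is a minimal Python sketch of a quadratically regularized Newton iteration: at each iterate x, the step d solves (H + sigma*I) d = -g, where g and H are the gradient and Hessian at x. The rule used here to update the regularization parameter sigma is a generic Levenberg-style heuristic standing in for the paper's algebraic rules, which the abstract does not spell out; the function names, constants, and tolerances are likewise illustrative.

import numpy as np

def regularized_newton(f, grad, hess, x0, sigma0=1.0, tol=1e-8, max_iter=200):
    # Quadratically regularized Newton method (sketch).
    # Step: solve (H + sigma * I) d = -g at the current iterate.
    # NOTE: the sigma update below is a generic Levenberg-style
    # heuristic, not the algebraic rule proposed in the paper.
    x, sigma = np.asarray(x0, dtype=float), float(sigma0)
    I = np.eye(len(x))
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:        # first-order stationarity test
            break
        H = hess(x)
        # Grow sigma until H + sigma * I is positive definite (Cholesky
        # succeeds), so the regularized system yields a descent direction.
        while True:
            try:
                np.linalg.cholesky(H + sigma * I)
                break
            except np.linalg.LinAlgError:
                sigma = max(2.0 * sigma, 1e-8)
        d = np.linalg.solve(H + sigma * I, -g)
        if f(x + d) < f(x):                  # accept: relax regularization
            x, sigma = x + d, 0.5 * sigma
        else:                                # reject: regularize harder
            sigma *= 4.0
    return x

For instance, on the two-dimensional Rosenbrock function the iteration recovers the minimizer near (1, 1):

f = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2.0 * (1 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
                           200.0 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2.0 - 400.0 * x[1] + 1200.0 * x[0]**2, -400.0 * x[0]],
                           [-400.0 * x[0], 200.0]])
print(regularized_newton(f, grad, hess, [-1.2, 1.0]))  # approx. [1. 1.]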
