GLOBALIZING STABILIZED SQP BY SMOOTH PRIMAL-DUAL EXACT PENALTY FUNCTION

An iteration of the stabilized sequential quadratic programming (sSQP) method consists of solving a certain quadratic program in the primal-dual space, regularized in the dual variables. The advantage over classical sequential quadratic programming (SQP) is that no constraint qualifications are required for fast local convergence, i.e., the problem may be degenerate. In particular, for equality-constrained problems the superlinear rate of convergence is guaranteed under the sole assumption that the primal-dual starting point is close enough to a stationary point and an associated noncritical Lagrange multiplier (the latter condition being weaker than the second-order sufficient optimality condition). However, unlike for SQP, designing natural globally convergent algorithms based on the sSQP idea has proved quite a challenge, and currently there are very few proposals in this direction. For equality-constrained problems, we suggest using for this task a linesearch for the smooth two-parameter primal-dual exact penalty function, which is the sum of the Lagrangian and squared penalizations of the violation of the constraints and of the violation of Lagrangian stationarity with respect to the primal variables. Reasonable global convergence properties are established. Moreover, we show that the globalized algorithm preserves the superlinear rate of sSQP under the weak assumptions mentioned above. We also report numerical experience on a set of degenerate test problems.
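The two ingredients above can be sketched on a toy problem (the problem data, parameter choices, and helper names below are hypothetical illustrations, not taken from the paper): an sSQP step solves a KKT system whose dual block is regularized by a term -sigma*I, and the merit function is the Lagrangian plus squared penalties on the constraint violation h(x) and on the Lagrangian stationarity residual grad_x L(x, lam). Choosing sigma proportional to the current primal-dual residual is one common option in the sSQP literature; it is used here purely for illustration.

```python
import numpy as np

# Toy equality-constrained problem (hypothetical, for illustration only):
# minimize x1^2 + x2^2  subject to  x1 + x2 - 2 = 0.
# Its solution is x* = (1, 1) with Lagrange multiplier lam* = -2.
def f_grad(x): return 2.0 * x
def f_hess(x): return 2.0 * np.eye(2)
def h(x): return np.array([x[0] + x[1] - 2.0])
def h_jac(x): return np.array([[1.0, 1.0]])

def lag_grad(x, lam):
    # grad_x L(x, lam) = grad f(x) + h'(x)^T lam
    return f_grad(x) + h_jac(x).T @ lam

def penalty(x, lam, sigma=10.0, rho=1.0):
    # Smooth two-parameter primal-dual penalty (sketch):
    # L(x, lam) + (sigma/2)||h(x)||^2 + (rho/2)||grad_x L(x, lam)||^2
    L = x @ x + lam @ h(x)  # f(x) + <lam, h(x)> for this toy problem
    g = lag_grad(x, lam)
    return L + 0.5 * sigma * (h(x) @ h(x)) + 0.5 * rho * (g @ g)

def ssqp_step(x, lam, sigma):
    # One sSQP step: solve the dual-regularized KKT system
    # [ H      A^T      ] [dx  ]   [ grad_x L(x, lam) ]
    # [ A   -sigma * I  ] [dlam] = -[ h(x)             ]
    n, m = len(x), len(lam)
    H, A = f_hess(x), h_jac(x)
    K = np.block([[H, A.T], [A, -sigma * np.eye(m)]])
    rhs = -np.concatenate([lag_grad(x, lam), h(x)])
    d = np.linalg.solve(K, rhs)
    return x + d[:n], lam + d[n:]

# Run pure sSQP iterations with sigma tied to the residual norm.
x, lam = np.array([0.0, 0.5]), np.array([0.0])
for _ in range(20):
    sigma = max(1e-12, np.linalg.norm(lag_grad(x, lam)) + np.linalg.norm(h(x)))
    x, lam = ssqp_step(x, lam, sigma)
```

In the full method, `penalty` would serve as the linesearch merit function for accepting or damping the sSQP step; this sketch omits the linesearch and simply iterates the local step.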
