On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming

In this paper, we consider the linearly constrained composite convex optimization problem, whose objective is the sum of a smooth function and a possibly nonsmooth function. We propose an inexact augmented Lagrangian (IAL) framework for solving this problem. The stopping criterion used for the augmented Lagrangian (AL) subproblem in the proposed IAL framework is weaker and potentially much easier to check than those used in most existing IAL frameworks and methods. We analyze the global convergence and the nonergodic convergence rate of the proposed IAL framework.
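
For concreteness, the problem class and a generic IAL iteration can be sketched in conventional notation; the symbols f, g, A, b, beta, and lambda below are standard assumptions and are not defined in this abstract:

\min_{x}\; f(x) + g(x) \quad \text{s.t.}\quad Ax = b,

L_{\beta}(x,\lambda) \;=\; f(x) + g(x) + \langle \lambda,\, Ax - b\rangle + \tfrac{\beta}{2}\,\|Ax - b\|^{2},

x^{k+1} \;\approx\; \arg\min_{x}\, L_{\beta}(x,\lambda^{k}), \qquad \lambda^{k+1} \;=\; \lambda^{k} + \beta\,(Ax^{k+1} - b).

Here the minimization of the AL subproblem is carried out only inexactly, i.e., the iterate x^{k+1} is accepted once a prescribed stopping criterion is met, after which the multiplier lambda is updated; the contribution of the paper lies in the specific (weaker) stopping criterion and the resulting nonergodic rate analysis.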
