The direct extension of ADMM for three-block separable convex minimization models is convergent when one function is strongly convex

The alternating direction method of multipliers (ADMM) is a benchmark for solving a two-block linearly constrained convex minimization model whose objective function is the sum of two functions without coupled variables. Meanwhile, it is known that convergence is not guaranteed if the ADMM is directly extended to a multiple-block convex minimization model whose objective function is the sum of more than two functions. Recently, some authors have actively studied strong convexity conditions on the objective function that suffice to ensure the convergence of the direct extension of ADMM, or the convergence of variants in which the original scheme is appropriately modified. However, these strong convexity conditions still seem too strict to be satisfied by some applications for which the direct extension of ADMM works well; and the modified schemes are less efficient or less convenient to implement than the original direct extension of ADMM. We are thus motivated to understand why the original direct extension of ADMM works for some applications and under which realistic conditions its convergence can be guaranteed. We answer this question for the three-block case, where the objective is the sum of three separable functions, and show that when one of them is strongly convex, the direct extension of ADMM is convergent. Note that strong convexity of one function does hold for many applications. We further estimate the worst-case convergence rate measured by the iteration complexity in both the ergodic and nonergodic senses for the direct extension of ADMM, and show that its global linear convergence in the asymptotic sense can be guaranteed under some additional conditions.
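For concreteness, here is a minimal LaTeX sketch of the setting discussed above; the symbols $\theta_i$, $A_i$, $b$, $\beta$, $\lambda$ and the choice of which block is strongly convex are illustrative notation, not fixed by the abstract. The three-block model and the directly extended ADMM scheme read

\[
\min_{x_1,x_2,x_3}\ \theta_1(x_1)+\theta_2(x_2)+\theta_3(x_3)
\quad\text{s.t.}\quad A_1x_1+A_2x_2+A_3x_3=b,
\]
with augmented Lagrangian
\[
\mathcal{L}_\beta(x_1,x_2,x_3,\lambda)=\sum_{i=1}^{3}\theta_i(x_i)
-\lambda^{\top}\Big(\sum_{i=1}^{3}A_ix_i-b\Big)
+\frac{\beta}{2}\Big\|\sum_{i=1}^{3}A_ix_i-b\Big\|^{2},
\]
and, given $(x_2^{k},x_3^{k},\lambda^{k})$, one iteration of the direct extension of ADMM is
\[
\begin{aligned}
x_1^{k+1} &= \operatorname*{arg\,min}_{x_1}\ \mathcal{L}_\beta(x_1,x_2^{k},x_3^{k},\lambda^{k}),\\
x_2^{k+1} &= \operatorname*{arg\,min}_{x_2}\ \mathcal{L}_\beta(x_1^{k+1},x_2,x_3^{k},\lambda^{k}),\\
x_3^{k+1} &= \operatorname*{arg\,min}_{x_3}\ \mathcal{L}_\beta(x_1^{k+1},x_2^{k+1},x_3,\lambda^{k}),\\
\lambda^{k+1} &= \lambda^{k}-\beta\big(A_1x_1^{k+1}+A_2x_2^{k+1}+A_3x_3^{k+1}-b\big).
\end{aligned}
\]
The strong convexity assumed of one function (say $\theta_3$, purely for illustration) means there exists $\sigma>0$ such that
\[
\theta_3(y)\ \ge\ \theta_3(x)+\langle\xi,\,y-x\rangle+\frac{\sigma}{2}\|y-x\|^{2}
\qquad\text{for all }x,y\text{ and all }\xi\in\partial\theta_3(x).
\]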
