Tight global linear convergence rate bounds for Douglas–Rachford splitting

Recently, several authors have established local and global convergence rate results for Douglas–Rachford splitting under strong monotonicity, Lipschitz continuity, and cocoercivity assumptions. Most of these results focus on the convex optimization setting. In the more general monotone inclusion setting, Lions and Mercier [25] showed a linear convergence rate bound under the assumption that one of the two operators is strongly monotone and Lipschitz continuous. We show that this bound is not tight, meaning that no problem from the considered class converges exactly at that rate. In this paper, we present tight global linear convergence rate bounds for that class of problems. We also provide tight linear convergence rate bounds for two further settings: one in which one of the operators is both strongly monotone and cocoercive, and one in which one of the operators is strongly monotone and the other is cocoercive. All our linear convergence results are obtained by proving the stronger property that the Douglas–Rachford operator is contractive.
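To fix notation, the sketch below runs the standard Douglas–Rachford iteration on a small convex problem of the form minimize f(x) + g(x), a special case of finding a zero of the sum of two monotone operators. The quadratic f, the shifted l1 term g, and the step size gamma are illustrative assumptions chosen only for this example, not taken from the paper; f is strongly convex and smooth, which corresponds to the strongly monotone and Lipschitz continuous setting discussed above.

import numpy as np

# Toy problem: minimize f(x) + g(x) with f(x) = 0.5 x'Px (strongly convex and
# smooth, so its gradient is strongly monotone and Lipschitz) and
# g(x) = ||x - b||_1. Both proximal operators are available in closed form.

gamma = 1.0                                  # step-size parameter (assumed)
P = np.array([[2.0, 0.5], [0.5, 1.0]])       # positive definite quadratic term
b = np.array([1.0, -2.0])                    # shift in the l1 term (assumed)

def prox_f(z):
    # prox_{gamma f}(z) solves the linear system (gamma*P + I) x = z
    return np.linalg.solve(gamma * P + np.eye(2), z)

def prox_g(z):
    # soft-thresholding around b: the prox of gamma * ||. - b||_1
    return b + np.sign(z - b) * np.maximum(np.abs(z - b) - gamma, 0.0)

# Douglas-Rachford iteration on the auxiliary variable z:
#   x_k = prox_f(z_k), y_k = prox_g(2 x_k - z_k), z_{k+1} = z_k + y_k - x_k
z = np.zeros(2)
for k in range(40):
    x = prox_f(z)
    y = prox_g(2 * x - z)
    z_new = z + y - x
    if k % 5 == 0:
        # fixed-point residual; it shrinks by a roughly constant factor,
        # reflecting the contractiveness of the Douglas-Rachford operator
        print(k, np.linalg.norm(z_new - z))
    z = z_new

With these choices, the fixed-point residual decreases geometrically, which is the qualitative linear convergence behavior that the tight bounds in the paper quantify.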

[1] A. Raghunathan et al., ADMM for Convex Quadratic Programs: Linear Convergence and Infeasibility Detection, 2014, arXiv:1411.7288.

[2] D. Russell Luke et al., Alternating Projections and Douglas–Rachford for Sparse Affine Feasibility, 2013, IEEE Transactions on Signal Processing.

[3] Heinz H. Bauschke et al., Optimal rates of convergence of matrices with applications, 2014.

[4] B. Mercier et al., A dual algorithm for the solution of nonlinear variational problems via finite element approximation, 1976.

[5] Alberto Bemporad et al., Douglas–Rachford splitting: Complexity estimates and accelerated variants, 2014, 53rd IEEE Conference on Decision and Control.

[6] Patrick L. Combettes et al., Proximal Splitting Methods in Signal Processing, 2009, Fixed-Point Algorithms for Inverse Problems in Science and Engineering.

[7] Stephen P. Boyd et al., Metric Selection in Douglas–Rachford Splitting and ADMM, 2014.

[8] D. Russell Luke et al., Nonconvex Notions of Regularity and Convergence of Fundamental Algorithms for Feasibility Problems, 2012, SIAM J. Optim.

[9] Pontus Giselsson et al., Tight linear convergence rate bounds for Douglas–Rachford splitting and ADMM, 2015, 54th IEEE Conference on Decision and Control (CDC).

[10] Hyunjoong Kim et al., Functional Analysis I, 2017.

[11] Jonathan Eckstein, Splitting methods for monotone operators with applications to parallel optimization, 1989.

[12] A. A. Potapenko et al., Method of Successive Approximations, 1964.

[13] Stephen P. Boyd et al., Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2011, Found. Trends Mach. Learn.

[14] Qing Ling et al., On the Linear Convergence of the ADMM in Decentralized Consensus Optimization, 2013, IEEE Transactions on Signal Processing.

[15] Daniel Boley et al., Local Linear Convergence of the Alternating Direction Method of Multipliers on Quadratic or Linear Programs, 2013, SIAM J. Optim.

[16] H. H. Rachford et al., On the numerical solution of heat conduction problems in two and three space variables, 1956.

[17] W. R. Mann et al., Mean value methods in iteration, 1953.

[18] Euhanna Ghadimi et al., Optimal Parameter Selection for the Alternating Direction Method of Multipliers (ADMM): Quadratic Problems, 2013, IEEE Transactions on Automatic Control.

[19] Heinz H. Bauschke et al., The rate of linear convergence of the Douglas–Rachford algorithm for subspaces is the cosine of the Friedrichs angle, 2013, J. Approx. Theory.

[20] Heinz H. Bauschke et al., Convex Analysis and Monotone Operator Theory in Hilbert Spaces, 2011, CMS Books in Mathematics.

[21] Hung M. Phan et al., Linear convergence of the Douglas–Rachford method for two closed sets, 2014, arXiv:1401.6509.

[22] H. H. Rachford et al., The Numerical Solution of Parabolic and Elliptic Differential Equations, 1955.

[23] Pascal Bianchi et al., Explicit Convergence Rate of a Distributed Alternating Direction Method of Multipliers, 2013, IEEE Transactions on Automatic Control.

[24] Bingsheng He et al., On the O(1/n) Convergence Rate of the Douglas–Rachford Alternating Direction Method, 2012, SIAM J. Numer. Anal.

[25] P.-L. Lions and B. Mercier, Splitting Algorithms for the Sum of Two Nonlinear Operators, 1979, SIAM J. Numer. Anal.

[26] Laurent Demanet et al., Eventual linear convergence of the Douglas–Rachford iteration for basis pursuit, 2013, Math. Comput.

[27] Heinz H. Bauschke et al., Optimal Rates of Linear Convergence of Relaxed Alternating Projections and Generalized Douglas–Rachford Methods for Two Subspaces, 2015, Numerical Algorithms.

[28] Zhi-Quan Luo et al., On the linear convergence of the alternating direction method of multipliers, 2012, Mathematical Programming.

[29] Simon Setzer et al., Split Bregman Algorithm, Douglas–Rachford Splitting and Frame Shrinkage, 2009, SSVM.

[30] Damek Davis et al., Convergence Rate Analysis of Several Splitting Schemes, 2014, arXiv:1406.4834.

[31] P. L. Combettes et al., Compositions and convex combinations of averaged nonexpansive operators, 2014, arXiv:1407.5100.

[32] Wotao Yin et al., Faster Convergence Rates of Relaxed Peaceman–Rachford and ADMM Under Regularity Assumptions, 2014, Math. Oper. Res.

[33] G. Minty, Monotone (nonlinear) operators in Hilbert space, 1962.

[34] Wotao Yin et al., On the Global and Linear Convergence of the Generalized Alternating Direction Method of Multipliers, 2016, J. Sci. Comput.

[35] Michael I. Jordan et al., A General Analysis of the Convergence of ADMM, 2015, ICML.

[36] Stephen P. Boyd et al., Diagonal scaling in Douglas–Rachford splitting and ADMM, 2014, 53rd IEEE Conference on Decision and Control.