Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient

We present PDLP, a practical first-order method for linear programming (LP) that can solve to the high levels of accuracy expected in traditional LP applications. In addition, it can scale to very large problems because its core operation is matrix-vector multiplication. PDLP is derived by applying the primal-dual hybrid gradient (PDHG) method, popularized by Chambolle and Pock (2011), to a saddle-point formulation of LP. PDLP enhances PDHG for LP by combining several new techniques with established techniques from the literature; the enhancements include diagonal preconditioning, presolving, adaptive step sizes, and adaptive restarting. PDLP improves the state of the art for first-order methods applied to LP. We compare PDLP with SCS, an ADMM-based solver, on a set of 383 LP instances derived from MIPLIB 2017. With a target of 10⁻⁸ relative accuracy and a one-hour time limit, PDLP achieves a 6.3x reduction in the geometric mean of solve times and a 4.6x reduction in the number of unsolved instances (from 227 to 49). Furthermore, we highlight standard benchmark instances and a large-scale application (PageRank) where our open-source prototype of PDLP, written in Julia, outperforms a commercial LP solver.
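To make the core iteration concrete, the sketch below applies plain PDHG to the saddle-point form min_{x ≥ 0} max_y cᵀx + yᵀ(b − Ax) of the standard-form LP min{cᵀx : Ax = b, x ≥ 0}. This is a minimal illustration in Julia (the language of the PDLP prototype) under simplifying assumptions: it handles only equality constraints with nonnegativity bounds, uses fixed step sizes, and omits all of PDLP's enhancements (diagonal preconditioning, presolving, adaptive step sizes, restarts). The function name pdhg_lp and the fixed iteration budget are illustrative choices, not part of PDLP.

    # Minimal PDHG sketch (not PDLP itself) for  min cᵀx  s.t.  Ax = b, x ≥ 0,
    # via the saddle point  min_{x ≥ 0} max_y  cᵀx + yᵀ(b − Ax).
    # Hypothetical illustration: fixed step sizes, no preconditioning,
    # presolve, adaptivity, or restarts.
    using LinearAlgebra

    function pdhg_lp(A::AbstractMatrix, b::AbstractVector, c::AbstractVector;
                     iters::Int = 10_000)
        m, n = size(A)
        # Equal step sizes τ = σ satisfying τ·σ·‖A‖² < 1, the standard
        # condition for PDHG convergence.
        η = 0.9 / opnorm(Matrix(A))
        τ, σ = η, η
        x, y = zeros(n), zeros(m)
        for _ in 1:iters
            # Primal step: move against c − Aᵀy, then project onto x ≥ 0.
            x_next = max.(x .- τ .* (c .- A' * y), 0.0)
            # Dual step: ascend on the residual at the extrapolated point 2·x_next − x.
            y .+= σ .* (b .- A * (2 .* x_next .- x))
            x = x_next
        end
        return x, y  # approximate primal and dual solutions
    end

On the toy LP min x₁ + 2x₂ subject to x₁ + x₂ = 1, x ≥ 0, i.e. pdhg_lp([1.0 1.0], [1.0], [1.0, 2.0]), the iterates approach the optimum x = (1, 0). PDLP's enhancements exist precisely because this vanilla iteration, while cheap per step, can converge slowly on realistic instances.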

[1] Daniel Cremers et al. An algorithm for minimizing the Mumford-Shah functional. 2009 IEEE 12th International Conference on Computer Vision, 2009.

[2] Min Li et al. Adaptive Primal-Dual Splitting Methods for Statistical Learning and Image Processing. NIPS, 2015.

[3] Robert E. Bixby et al. Presolve Reductions in Mixed Integer Programming. INFORMS Journal on Computing, 2020.

[4] Guanghui Lan et al. Primal-dual first-order methods with O(1/ε) iteration-complexity for cone programming. 2011.

[5] Antonin Chambolle et al. A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging. Journal of Mathematical Imaging and Vision, 2011.

[6] Thomas Pock et al. A First-Order Primal-Dual Algorithm with Linesearch. SIAM Journal on Optimization, 2016.

[7] Haihao Lu et al. Faster First-Order Primal-Dual Methods for Linear Programming using Restarts and Sharpness. 2021.

[8] James Renegar et al. Accelerated first-order methods for hyperbolic programming. Mathematical Programming, 2015.

[9] J. Zico Kolter et al. OptNet: Differentiable Optimization as a Layer in Neural Networks. ICML, 2017.

[10] Kinjal Basu et al. ECLIPSE: An Extreme-Scale Linear Program Solver for Web-Applications. ICML, 2020.

[11] Alan Edelman et al. Julia: A Fresh Approach to Numerical Computing. SIAM Review, 2014.

[12] Alexander Schrijver et al. Theory of Linear and Integer Programming. Wiley-Interscience Series in Discrete Mathematics and Optimization, 1986.

[13] Yurii Nesterov et al. Linear convergence of first order methods for non-strongly convex optimization. Mathematical Programming, 2015.

[14] Lieven Vandenberghe et al. On the equivalence of the primal-dual hybrid gradient method and Douglas–Rachford splitting. Mathematical Programming, 2018.

[15] Arkadi Nemirovski et al. Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems. SIAM Journal on Optimization, 2004.

[16] D. Ruiz. A Scaling Algorithm to Equilibrate Both Rows and Columns Norms in Matrices. 2001.

[17] Yuval Rabani et al. Linear Programming. Handbook of Approximation Algorithms and Metaheuristics, 2007.

[18] Antonin Chambolle et al. Diagonal preconditioning for first order primal-dual algorithms in convex optimization. 2011 International Conference on Computer Vision, 2011.

[19] G. Dantzig. Origins of the simplex method. 1990.

[20] Amir Beck et al. FOM – a MATLAB toolbox of first-order methods for solving convex optimization problems. Optimization Methods and Software, 2019.

[21] Brendan O'Donoghue et al. Operator splitting for a homogeneous embedding of the monotone linear complementarity problem. 2020.

[22] Álvaro Veiga et al. Exploiting low-rank structure in semidefinite programming by approximate operator splitting. Optimization, 2018.

[23] I. Maros. Computational Techniques of the Simplex Method. 2002.

[24] Tony F. Chan et al. A General Framework for a Class of First Order Primal-Dual Algorithms for Convex Optimization in Imaging Science. SIAM Journal on Imaging Sciences, 2010.

[25] Stephen P. Boyd et al. Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers. Foundations and Trends in Machine Learning, 2011.

[26] Benjamin Müller et al. The SCIP Optimization Suite 5.0. 2017.

[27] Stephen P. Boyd et al. Conic Optimization via Operator Splitting and Homogeneous Self-Dual Embedding. Journal of Optimization Theory and Applications, 2013.

[28] Hartwig Anzt et al. Sparse Linear Algebra on AMD and NVIDIA GPUs – The Race Is On. ISC, 2020.

[29] Antonin Chambolle et al. Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications. SIAM Journal on Optimization, 2017.

[30] Albert et al. Emergence of scaling in random networks. Science, 1999.

[31] Ness B. Shroff et al. A New Alternating Direction Method for Linear Programming. NIPS, 2017.

[32] Heinz H. Bauschke et al. Convex Analysis and Monotone Operator Theory in Hilbert Spaces. CMS Books in Mathematics, 2011.

[33] Dimitri P. Bertsekas et al. On the Douglas–Rachford splitting method and the proximal point algorithm for maximal monotone operators. Mathematical Programming, 1992.

[34] Emmanuel J. Candès et al. Adaptive Restart for Accelerated Gradient Schemes. Foundations of Computational Mathematics, 2012.

[35] Emmanuel J. Candès et al. Templates for convex cone problems with applications to sparse signal recovery. Mathematical Programming Computation, 2010.

[36] B. V. Dean et al. Studies in Linear and Non-Linear Programming. 1959.

[37] On the convergence of stochastic primal-dual hybrid gradient. arXiv:1911.00799, 2019.

[38] Kim-Chuan Toh et al. An Asymptotically Superlinearly Convergent Semismooth Newton Augmented Lagrangian Method for Linear Programming. SIAM Journal on Optimization, 2020.

[39] Stephen P. Boyd et al. OSQP: an operator splitting solver for quadratic programs. 2018 UKACC 12th International Conference on Control (CONTROL), 2017.

[40] Mingqiang Zhu et al. An Efficient Primal-Dual Hybrid Gradient Algorithm for Total Variation Image Restoration. 2008.

[41] Shiqian Ma et al. An ADMM-based interior-point method for large-scale linear programming. Optimization Methods and Software, 2018.

[42] Stephen P. Boyd et al. A Primer on Monotone Operator Methods. 2015.

[43] Christian Tjandraatmadja et al. Solving Mixed Integer Programs Using Neural Networks. arXiv, 2020.

[44] Amir Beck et al. First-Order Methods in Optimization. 2017.

[45] Antonin Chambolle et al. On the ergodic convergence rates of a first-order primal–dual algorithm. Mathematical Programming, 2016.

[46] Yurii Nesterov et al. Interior-point polynomial algorithms in convex programming. SIAM Studies in Applied Mathematics, 1994.

[47] Xiaoming Yuan et al. Adaptive Primal-Dual Hybrid Gradient Methods for Saddle-Point Problems. arXiv:1305.0546, 2013.

[48] J. Zico Kolter et al. A Semismooth Newton Method for Fast, Generic Convex Programming. ICML, 2017.

[49] Timo Berthold et al. MIPLIB 2017: data-driven compilation of the 6th mixed-integer programming library. Mathematical Programming Computation, 2021.

[50] William Orchard-Hays. History of Mathematical Programming Systems. Annals of the History of Computing, 1984.

[51] Haihao Lu et al. Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming. SIAM Journal on Optimization, 2021.

[52] D. Bertsekas et al. An Alternating Direction Method for Linear Programming. 1990.

[53] Tianbao Yang et al. RSG: Beating Subgradient Method without Smoothness and Strong Convexity. Journal of Machine Learning Research, 2015.

[54] Javier Peña et al. First-Order Algorithm with O(ln(1/ε)) Convergence for ε-Equilibrium in Two-Person Zero-Sum Games. AAAI, 2008.

[55] Elmer Earl Branstetter et al. The theory of linear programming. 1963.

[56] Bingsheng He et al. Convergence Analysis of Primal-Dual Algorithms for a Saddle-Point Problem: From Contraction Perspective. SIAM Journal on Imaging Sciences, 2012.

[57] Mark Cannon et al. COSMO: A conic operator splitting method for large convex problems. 2019 18th European Control Conference (ECC), 2019.

[58] Y. Nesterov. A method for solving the convex programming problem with convergence rate O(1/k²). 1983.

[59] Jonathan Eckstein et al. Efficient Distributed-Memory Parallel Matrix-Vector Multiplication with Wide or Tall Unstructured Sparse Matrices. arXiv, 2018.

[60] Stephen P. Boyd et al. Proximal Algorithms. Foundations and Trends in Optimization, 2013.

[61] Nesa L'abbe Wu et al. Linear programming and extensions. 1981.

[62] G. M. Korpelevich. The extragradient method for finding saddle points and other problems. 1976.

[63] Christopher Fougner et al. Parameter Selection and Preconditioning for a Graph Form Solver. arXiv:1503.08366, 2015.

[64] R. Rockafellar. Monotone Operators and the Proximal Point Algorithm. 1976.

[65] Laurent Condat et al. A Primal–Dual Splitting Method for Convex Optimization Involving Lipschitzian, Proximable and Linear Composite Terms. Journal of Optimization Theory and Applications, 2012.

[66] Yurii Nesterov et al. Subgradient methods for huge-scale optimization problems. Mathematical Programming, 2013.

[67] Stephen P. Boyd et al. Convex Optimization. Algorithms and Theory of Computation Handbook, 2004.