Iteratively reweighted least squares and slime mold dynamics: connection and convergence

We present a connection between two dynamical systems arising in entirely different contexts: the Iteratively Reweighted Least Squares (IRLS) algorithm, used in compressed sensing and sparse recovery to find a minimum 1-norm solution in an affine space, and the dynamics of a slime mold (Physarum polycephalum) that finds the shortest path in a maze. We elucidate this connection by presenting a new dynamical system, the Meta-Algorithm, and showing that both the IRLS algorithm and the slime mold dynamics can be obtained by specializing it to disjoint sets of variables. Subsequently, building on work on slime mold dynamics for finding shortest paths, we prove convergence and obtain complexity bounds for the Meta-Algorithm, which can be viewed as a “damped” version of the IRLS algorithm. A consequence of this latter result is a slime mold dynamics for the undirected transshipment problem that computes a (1 + ε)-approximate solution in time polynomial in the size of the input graph, the maximum edge cost, and 1/ε, a problem that was left open by the work of Bonifaci et al. [10] (Physarum can compute shortest paths, SODA, Kyoto, Japan, pp. 233–240).
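
To make the IRLS side of the connection concrete, the following is a minimal sketch, in Python/NumPy, of the reweighted least-squares scheme for finding a minimum 1-norm point of the affine space {x : Ax = b}. The function name irls_min_l1, the damping parameter beta, the fixed step count, and the regularization eps are illustrative assumptions, not the exact Meta-Algorithm update or the parameters analyzed in the paper.

```python
import numpy as np

def irls_min_l1(A, b, steps=200, beta=0.5, eps=1e-9):
    """Sketch of (damped) IRLS for min ||x||_1 subject to Ax = b.

    Assumes A (m x n, m < n) has full row rank. Each iteration solves the
    weighted least-squares problem  min sum_i x_i^2 / w_i  s.t.  Ax = b,
    whose closed form is x = W A^T (A W A^T)^{-1} b with W = diag(w),
    and then moves the weights toward |x|. beta = 1 recovers the plain
    IRLS reweighting w <- |x|; beta < 1 is a damped update in the spirit
    of the Meta-Algorithm. The step count, beta, and eps are illustrative.
    """
    n = A.shape[1]
    w = np.ones(n)
    x = np.zeros(n)
    for _ in range(steps):
        W = np.diag(w)
        # Weighted least-squares step restricted to the affine space Ax = b.
        x = W @ A.T @ np.linalg.solve(A @ W @ A.T, b)
        # Reweighting: the weights behave like conductances that grow on
        # heavily used coordinates and shrink elsewhere.
        w = (1.0 - beta) * w + beta * (np.abs(x) + eps)
    return x

# Example: recover a sparse solution of an underdetermined system.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 60))
x_true = np.zeros(60)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = irls_min_l1(A, b)
```

With beta = 1 the update w ← |x| is the classical IRLS reweighting; the slime mold analogy is that w plays the role of edge conductances, which the Physarum dynamics grows along heavily used paths and lets decay elsewhere.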

[1] Shang-Hua Teng, et al. Nearly-linear time algorithms for graph partitioning, graph sparsification, and solving linear systems, 2003, STOC '04.

[2] Kurt Mehlhorn, et al. Physarum can compute shortest paths, 2011, SODA.

[3] Andrew V. Goldberg, et al. Beyond the flow decomposition barrier, 1998, JACM.

[4] Kurt Mehlhorn, et al. Convergence of the Non-Uniform Directed Physarum Model, 2019, Theor. Comput. Sci.

[5] Nisheeth K. Vishnoi, et al. On a Natural Dynamics for Linear Programming, 2015, ITCS.

[6] M. Putti, et al. Numerical Solution of Monge-Kantorovich Equations via a dynamic formulation, 2017, arXiv:1709.06765.

[7] Bernard Chazelle, et al. Natural algorithms and influence systems, 2012, CACM.

[8] Luca Cardelli, et al. The Cell Cycle Switch Computes Approximate Majority, 2012, Scientific Reports.

[9] Jack Brimberg, et al. Global Convergence of a Generalized Iterative Procedure for the Minisum Location Problem with lp Distances, 1993, Oper. Res.

[10] Nisheeth K. Vishnoi. The Speed of Evolution, 2015, SODA.

[11] Yin Tat Lee, et al. An homotopy method for lp regression provably beyond self-concordance and in input-sparsity time, 2018, STOC.

[12] Shang-Hua Teng, et al. The Laplacian Paradigm: Emerging Algorithms for Massive Graphs, 2010, TAMC.

[13] Robert E. Tarjan, et al. Network Flow and Testing Graph Connectivity, 1975, SIAM J. Comput.

[14] Xiaoming Huo, et al. Uncertainty principles and ideal atomic decomposition, 2001, IEEE Trans. Inf. Theory.

[15] Emmanuel J. Candès, et al. Decoding by linear programming, 2005, IEEE Trans. Inf. Theory.

[16] Noga Alon, et al. A Biological Solution to a Fundamental Distributed Computing Problem, 2011, Science.

[17] Richard M. Karp, et al. An n^{5/2} Algorithm for Maximum Matchings in Bipartite Graphs, 1971, SWAT.

[18] Jonah Sherman, et al. Nearly Maximum Flows in Nearly Linear Time, 2013, FOCS.

[19] L. Perko. Differential Equations and Dynamical Systems, 1991.

[20] Prateek Jain, et al. Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems, 2019, AISTATS.

[21] Deborah M. Gordon, et al. Ant Encounters: Interaction Networks and Colony Behavior, 2010.

[22] Yin Tat Lee, et al. Path Finding Methods for Linear Programming: Solving Linear Programs in Õ(√rank) Iterations and Faster Algorithms for Maximum Flow, 2014, FOCS.

[23] I. Daubechies, et al. Iteratively reweighted least squares minimization for sparse recovery, 2008, arXiv:0807.0575.

[25] Daniel A. Spielman, et al. Faster approximate lossy generalized flow via interior point algorithms, 2008, STOC.

[26] Liming Yang, et al. Iteratively reweighted least squares for robust regression via SVM and ELM, 2019, arXiv.

[27] Junzhou Huang, et al. Fast iteratively reweighted least squares algorithms for analysis-based sparse reconstruction, 2018, Medical Image Anal.

[28] Stephen J. Wright. Primal-Dual Interior-Point Methods, 1997, Other Titles in Applied Mathematics.

[29] Kurt Mehlhorn, et al. Two Results on Slime Mold Computations, 2019, Theor. Comput. Sci.

[31] Nisheeth K. Vishnoi, et al. Natural Algorithms for Flow Problems, 2016, SODA.

[32] Daniel A. Spielman. Algorithms, Graph Theory, and the Solution of Laplacian Linear Equations, 2012, ICALP.

[33] Nisheeth K. Vishnoi, et al. Approximating the exponential, the Lanczos method and an Õ(m)-time spectral algorithm for balanced separator, 2011, STOC '12.

[34] J. McClellan, et al. Complex Chebyshev approximation for FIR filter design, 1995.

[35] Adrian Vladu, et al. Improved Convergence for ℓ∞ and ℓ1 Regression via Iteratively Reweighted Least Squares, 2019.

[36] Emmanuel J. Candès, et al. Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, 2004, IEEE Trans. Inf. Theory.

[37] A. Tero, et al. A mathematical model for adaptive transport network in path finding by true slime mold, 2007, Journal of Theoretical Biology.

[38] Kurt Mehlhorn, et al. Physarum Can Compute Shortest Paths: Convergence Proofs and Complexity Bounds, 2013, ICALP.

[39] S. Thomas Alexander, et al. A relationship between the recursive least squares update and homotopy continuation methods, 1991, IEEE Trans. Signal Process.

[40] M. R. Osborne. Finite Algorithms in Optimization and Data Analysis, 1985.

[41] Bhaskar D. Rao, et al. Sparse signal reconstruction from limited data using FOCUSS: a re-weighted minimum norm algorithm, 1997, IEEE Trans. Signal Process.

[42] Richard Peng, et al. Iterative Refinement for ℓp-norm Regression, 2019, SODA.

[43] C. Burrus. Iterative Reweighted Least Squares, 2014.

[44] Amir Beck, et al. On the Convergence of Alternating Minimization for Convex Programming with Applications to Iteratively Reweighted Least Squares and Decomposition Schemes, 2015, SIAM J. Optim.

[45] Kurt Mehlhorn, et al. Convergence of the Non-Uniform Physarum Dynamics, 2019, Theor. Comput. Sci.

[46] Richard Peng, et al. Fast, Provably Convergent IRLS Algorithm for p-norm Linear Regression, 2019, NeurIPS.

[47] Nisheeth K. Vishnoi. Lx = b, 2013, Found. Trends Theor. Comput. Sci.

[48] Wotao Yin, et al. Iteratively reweighted algorithms for compressive sensing, 2008, ICASSP.

[49] Michael Elad, et al. Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization, 2003, Proc. Natl. Acad. Sci.

[50] Leslie G. Valiant. Evolvability, 2009, JACM.

[51] Yurii Nesterov, et al. Interior-point polynomial algorithms in convex programming, 1994, SIAM Studies in Applied Mathematics.

[52] D. R. Fulkerson, et al. Maximal Flow Through a Network, 1956.

[53] T. Nakagaki, et al. Intelligence: Maze-solving by an amoeboid organism, 2000, Nature.

[54] Enrico Facca, et al. Towards a Stationary Monge-Kantorovich Dynamics: The Physarum Polycephalum Experience, 2016, SIAM J. Appl. Math.

[55] P. Green. Iteratively reweighted least squares for maximum likelihood estimation, 1984.

[56] A. E. Eiben, et al. From evolutionary computation to the evolution of things, 2015, Nature.

[57] James Y. Zou, et al. A Slime Mold Solver for Linear Programming Problems, 2012, CiE.

[58] C. Sidney Burrus, et al. Iterative reweighted least-squares design of FIR filters, 1994, IEEE Trans. Signal Process.

[59] Emery N. Brown, et al. Convergence and Stability of Iteratively Re-weighted Least Squares Algorithms, 2014, IEEE Trans. Signal Process.

[60] L. Karlovitz. Construction of nearest points in the Lp, p even, and L∞ norms. I, 1970.

[61] Geoffrey E. Hinton, et al. Deep Learning, 2015, Nature.

[62] László A. Végh. A Strongly Polynomial Algorithm for a Class of Minimum-Cost Flow Problems with Separable Convex Objectives, 2016, SIAM J. Comput.

[63] Bhaskar D. Rao, et al. An affine scaling methodology for best basis selection, 1999, IEEE Trans. Signal Process.

[64] Nisheeth K. Vishnoi, et al. IRLS and Slime Mold: Equivalence and Convergence, 2016, arXiv.

[65] Nicolai Bissantz, et al. Convergence Analysis of Generalized Iteratively Reweighted Least Squares Algorithms on Convex Function Spaces, 2008, SIAM J. Optim.

[66] Umesh Vazirani. Algorithms, games, and evolution, 2014, Proc. Natl. Acad. Sci.

[67] László A. Végh, et al. A simpler and faster strongly polynomial algorithm for generalized flow maximization, 2017, STOC.