A semismooth Newton method for adaptive distributed sparse linear regression

This work studies the application of the semismooth Newton (SSN) method to accelerate the convergence of distributed quadratic programming LASSO (DQP-LASSO), a consensus-based distributed sparse linear regression algorithm. DQP-LASSO uses the alternating direction method of multipliers (ADMM) to reduce the global LASSO problem to a series of local (per-agent) LASSO problems, whose solutions are then appropriately combined. The SSN algorithm enjoys superlinear convergence and thus permits solving these local problems more efficiently. In some cases, however, SSN may experience convergence issues. Here it is shown that the quadratic regularization inherent to ADMM is also sufficient to stabilize the SSN iterations, ensuring stable convergence of the overall scheme. Additionally, the structure of the SSN algorithm permits an adaptive implementation of distributed sparse regression, which allows time-varying sparse vectors to be estimated and reduces storage requirements when processing streams of data.
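
To make the mechanics concrete, below is a minimal sketch of an SSN solver for a local subproblem of the form min_x 0.5*||Ax - b||^2 + (rho/2)*||x - v||^2 + lam*||x||_1, i.e., the prox-regularized LASSO that arises at each agent in a consensus ADMM scheme. This is an illustrative assumption, not the paper's implementation: the function names (ssn_lasso, soft_threshold), the fixed-point formulation F(x) = x - S_{tau*lam}(x - tau*grad(x)) = 0, and the step-size choice are standard ingredients of semismooth Newton treatments of sparse problems. Note how the rho*I contribution of the ADMM penalty makes the reduced Newton systems positive definite, which is the stabilizing effect described in the abstract.

```python
import numpy as np


def soft_threshold(u, kappa):
    """Soft-thresholding operator: the prox of kappa * ||.||_1."""
    return np.sign(u) * np.maximum(np.abs(u) - kappa, 0.0)


def ssn_lasso(A, b, lam, rho, v, tol=1e-10, max_iter=50):
    """Semismooth Newton solver (illustrative sketch) for

        min_x 0.5*||A x - b||^2 + (rho/2)*||x - v||^2 + lam*||x||_1,

    posed as the piecewise-affine root-finding problem
        F(x) = x - S_{tau*lam}(x - tau*grad(x)) = 0,
    with grad(x) = A^T (A x - b) + rho*(x - v).
    """
    m, n = A.shape
    H = A.T @ A + rho * np.eye(n)        # regularized Hessian, PD for rho > 0
    c = A.T @ b + rho * v                # linear term of the smooth part
    tau = 1.0 / (np.linalg.norm(A, 2) ** 2 + rho)  # step size in (0, 1/L]
    x = np.zeros(n)
    for _ in range(max_iter):
        u = x - tau * (H @ x - c)
        Fx = x - soft_threshold(u, tau * lam)
        if np.linalg.norm(Fx) <= tol:
            break
        # A generalized Jacobian of F is J = I - D (I - tau*H), with D
        # diagonal, D_ii = 1 on the active set {i : |u_i| > tau*lam}.
        act = np.abs(u) > tau * lam
        d = np.empty(n)
        d[~act] = -Fx[~act]              # inactive rows of J are identity rows
        if act.any():
            # Active rows of J equal tau * H_i, so solve the reduced
            # positive-definite system H_AA d_A = -F_A / tau - H_AI d_I.
            rhs = -Fx[act] / tau - H[np.ix_(act, ~act)] @ d[~act]
            d[act] = np.linalg.solve(H[np.ix_(act, act)], rhs)
        x = x + d
    return x
```

In a DQP-LASSO-style scheme, each agent would call ssn_lasso on its local data with v assembled from the current consensus and dual iterates, then exchange results with its neighbors. Because F is piecewise affine, the inner iteration typically terminates in a few Newton steps, and warm-starting from the previous ADMM iterate suits the adaptive, streaming setting mentioned above.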
