Necessary and Sufficient Conditions of Solution Uniqueness in 1-Norm Minimization

This paper shows that the solutions to various 1-norm minimization problems are unique if, and only if, a common set of conditions is satisfied. This result applies broadly to the basis pursuit, basis pursuit denoising, and Lasso models, as well as to certain other 1-norm related models. These conditions were previously known to be sufficient for the basis pursuit model to have a unique solution; this paper shows that they are also necessary, and that they apply to a variety of 1-norm related models. The paper also discusses how to recognize unique solutions and how to verify the uniqueness conditions numerically. The proof technique is based on the strong duality and strict complementarity results of linear programming.
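As a rough illustration of the kind of numerical verification the abstract alludes to, the sketch below solves basis pursuit as a linear program and then checks a Fuchs-type sufficient condition for uniqueness: full column rank of the submatrix on the support, plus a dual certificate that is strictly smaller than 1 off the support. This is a sketch under assumptions, not the paper's exact procedure: the support threshold `1e-8`, the problem sizes, and the use of the minimum-norm least-squares certificate (which may fail even when some other valid certificate exists) are all choices made here for illustration.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Basis pursuit: min ||x||_1 subject to A x = b, recast as an LP
# via the standard split x = u - v with u, v >= 0.
m, n = 10, 30
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[2, 7, 11]] = [1.5, -2.0, 0.7]   # a sparse ground truth (illustrative)
b = A @ x_true

c = np.ones(2 * n)                       # sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])                # A u - A v = b
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
x = res.x[:n] - res.x[n:]

# Uniqueness check in the spirit of the conditions discussed in the paper:
# on the recovered support S, A_S must have full column rank, and there
# must exist y with A_S^T y = sign(x_S) and |a_j^T y| < 1 for all j not in S.
S = np.flatnonzero(np.abs(x) > 1e-8)     # support threshold is an assumption
A_S = A[:, S]
full_rank = np.linalg.matrix_rank(A_S) == len(S)

# Heuristic certificate: the minimum-norm solution of A_S^T y = sign(x_S).
y, *_ = np.linalg.lstsq(A_S.T, np.sign(x[S]), rcond=None)
off_S = np.setdiff1d(np.arange(n), S)
strictly_feasible = np.max(np.abs(A[:, off_S].T @ y)) < 1 - 1e-8

print("support:", S)
print("sufficient uniqueness check passed:", bool(full_rank and strictly_feasible))
```

Note the one-sided nature of this check: if it passes, the solution is unique; if it fails, a different dual certificate might still exist, which is precisely where the paper's necessary-and-sufficient characterization sharpens the picture.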
