Necessary and sufficient conditions of solution uniqueness in ℓ1 minimization

This paper shows that the solutions to various convex ℓ1 minimization problems are unique if and only if a common set of conditions is satisfied. This result applies broadly to the basis pursuit model, the basis pursuit denoising model, the Lasso model, and other ℓ1 models that either minimize f(Ax − b) or impose the constraint f(Ax − b) ≤ σ, where f is a strictly convex function. For these models, this paper proves that, given a solution x* and defining I = supp(x*) and s = sign(x*_I), x* is the unique solution if and only if A_I has full column rank and there exists y such that A_I^T y = s and |a_i^T y| < 1 for every i ∉ I. This condition was previously known to be sufficient for the basis pursuit model to have a unique solution supported on I; this paper shows that it is also necessary, and that it applies to a variety of other ℓ1 models. The paper also discusses ways to recognize unique solutions and to verify the uniqueness conditions numerically.
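The stated condition can be tested numerically, as the abstract mentions. Below is a minimal NumPy sketch (the function name and tolerance are my own, not from the paper): it forms I and s from a candidate solution, checks that A_I has full column rank, and tests one natural dual candidate, the least-squares y solving A_I^T y = s. Note that this particular y gives only a sufficient check: if it fails the strict inequality, some other y might still satisfy it, and a complete test would require solving a feasibility problem over all y.

```python
import numpy as np

def check_uniqueness_certificate(A, x, tol=1e-10):
    """Sufficient numerical check of the uniqueness condition for x.

    Returns True if (a) A_I has full column rank, and (b) the least-squares
    y solving A_I^T y = s satisfies |a_i^T y| < 1 for all i not in I.
    A False result does not rule out uniqueness, since condition (b) is
    tested with one particular y rather than over all feasible y.
    """
    I = np.flatnonzero(np.abs(x) > tol)        # support of x
    s = np.sign(x[I])                          # sign pattern on the support
    A_I = A[:, I]
    if np.linalg.matrix_rank(A_I) < len(I):    # full column rank required
        return False
    # Candidate dual certificate: least-squares solution of A_I^T y = s
    y, *_ = np.linalg.lstsq(A_I.T, s, rcond=None)
    if not np.allclose(A_I.T @ y, s):          # equality must hold exactly
        return False
    off = np.setdiff1d(np.arange(A.shape[1]), I)
    return bool(np.all(np.abs(A[:, off].T @ y) < 1 - tol))
```

For example, with A = [[1, 0, 0.2], [0, 1, 0.2]], the vector x* = (1, −1, 0) passes the check (the off-support column has |a_2^T y| = 0), while x* = (0, 0, 1) fails it.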
