Recovering non-negative and combined sparse representations

A non-negative solution to an underdetermined linear system can sometimes be recovered uniquely, even without imposing any additional sparsity constraints. In this paper, we derive conditions, based on the theory of polytopes, under which such a system admits a unique non-negative solution. Furthermore, we develop the paradigm of combined sparse representations, in which only a part of the coefficient vector is constrained to be non-negative while the rest is unconstrained (general). We analyze the recovery of the unique, sparsest solution for combined representations under three cases of coefficient support knowledge: (a) the non-zero supports of both the non-negative and the general coefficients are known, (b) only the non-zero support of the general coefficients is known, and (c) both non-zero supports are unknown. For case (c), we propose the combined orthogonal matching pursuit algorithm for coefficient recovery and derive the deterministic sparsity threshold under which recovery of the unique, sparsest coefficient vector is possible. We quantify the order complexity of the algorithms, examine their performance in exact and approximate recovery of coefficients under various noise conditions, and obtain their empirical phase transition characteristics. We show that basis pursuit with partial non-negativity constraints and the proposed greedy algorithm recover the unique sparse representation more reliably than their unconstrained counterparts. Finally, we demonstrate the utility of the proposed methods in recovering images corrupted by saturation noise.
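
To make the combined model concrete, the following is a minimal sketch of an OMP-style greedy recovery, assuming the dictionary is split column-wise into atoms whose coefficients must be non-negative and atoms whose coefficients are unconstrained. The function name `combined_omp`, the atom-selection rule, the `lsq_linear`-based partially non-negative least-squares step, and the stopping criteria are illustrative assumptions, not the exact algorithm proposed in the paper.

```python
import numpy as np
from scipy.optimize import lsq_linear

def combined_omp(A, y, nonneg_idx, max_atoms, tol=1e-6):
    """Greedy sketch: find a sparse x with A @ x ~ y and x[i] >= 0 for i in nonneg_idx."""
    n = A.shape[1]
    nonneg = np.zeros(n, dtype=bool)
    nonneg[list(nonneg_idx)] = True

    x = np.zeros(n)
    support = []
    residual = y.astype(float).copy()

    for _ in range(max_atoms):
        corr = A.T @ residual
        # Non-negative atoms can only reduce the residual through positive correlations;
        # general atoms are scored by the magnitude of their correlation.
        score = np.where(nonneg, corr, np.abs(corr))
        score[support] = -np.inf  # do not reselect atoms already in the support
        j = int(np.argmax(score))
        if score[j] <= tol:
            break
        support.append(j)

        # Partially non-negative least squares on the current support:
        # coordinates from the non-negative block are bounded below by zero.
        lower = np.where(nonneg[support], 0.0, -np.inf)
        sol = lsq_linear(A[:, support], y, bounds=(lower, np.inf))

        x = np.zeros(n)
        x[support] = sol.x
        residual = y - A @ x
        if np.linalg.norm(residual) <= tol:
            break

    return x
```

In this sketch, the only departures from standard orthogonal matching pursuit are that atoms in the non-negative block are scored by their signed (rather than absolute) correlation with the residual, and that the least-squares update is bounded below by zero on those coordinates.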
