Computable Performance Bounds on Sparse Recovery

In this paper, we develop verifiable sufficient conditions and computable performance bounds for ℓ1-minimization-based sparse recovery algorithms in both the noise-free and noisy cases. We define a family of quality measures for arbitrary sensing matrices as a set of optimization problems, and we design polynomial-time algorithms with theoretical global convergence guarantees to compute these quality measures. The proposed algorithms solve a series of second-order cone programs or linear programs. We derive performance bounds on the recovery errors in terms of these quality measures. We also analytically demonstrate that the quality measures are non-degenerate for a large class of random sensing matrices, provided the number of measurements is sufficiently large. Numerical experiments show that, compared with bounds based on the restricted isometry property, our error bounds apply to a wider range of problems and are tighter when the sparsity levels of the signals are relatively low.
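To make the "series of linear programs" idea concrete, the sketch below computes one representative quality measure of the general kind described above: the minimum of ||Az||_inf over vectors z with ||z||_inf = 1 and ||z||_1 <= s. This particular functional, the function name `quality_measure`, and the use of cvxpy are illustrative assumptions rather than the paper's exact definitions or algorithm; the sketch only shows how such a measure decomposes into one small, LP-representable subproblem per coordinate.

```python
# Minimal sketch (illustrative, not the paper's exact algorithm): estimate
#   omega(A, s) = min { ||A z||_inf : ||z||_inf = 1, ||z||_1 <= s }
# by pinning z_i = 1 for each coordinate i (the z -> -z symmetry covers
# z_i = -1) and solving one convex, LP-representable subproblem per i.
import numpy as np
import cvxpy as cp


def quality_measure(A: np.ndarray, s: float) -> float:
    """Assumed quality measure; smaller values suggest weaker recovery guarantees."""
    m, n = A.shape
    best = np.inf
    for i in range(n):
        z = cp.Variable(n)
        constraints = [
            z[i] == 1,            # pin the coordinate attaining ||z||_inf = 1
            cp.norm_inf(z) <= 1,  # all other coordinates stay within [-1, 1]
            cp.norm1(z) <= s,     # sparsity-like budget on the l1 norm
        ]
        prob = cp.Problem(cp.Minimize(cp.norm_inf(A @ z)), constraints)
        prob.solve()              # each subproblem reduces to a linear program
        if prob.value is not None:
            best = min(best, prob.value)
    return best


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100)) / np.sqrt(40)  # random Gaussian sensing matrix
    print(quality_measure(A, s=4.0))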
