Online performance guarantees for sparse recovery

A K*-sparse vector x* ∈ ℝ^N produces measurements via linear dimensionality reduction as u = Φx* + n, where Φ ∈ ℝ^(M×N) (M < N) and n ∈ ℝ^M has independent and identically distributed, zero-mean Gaussian entries with variance σ². Suppose that, upon termination, an algorithm returns a vector x̂ with K nonzero entries satisfying ∥u − Φx̂∥ ≤ ε. How far can x̂ be from x*? When the measurement matrix Φ provides a stable embedding of 2K-sparse signals (the so-called restricted isometry property), the two vectors must be very close. This paper therefore establishes worst-case bounds on the distance ∥x̂ − x*∥ based on meta-information available online, i.e., observable during or after the algorithm's execution, such as the achieved data error ε. These bounds improve upon the a priori (pre-run) recovery guarantees of the algorithms and are useful for exploring trade-offs between data error and solution sparsity. We also evaluate the performance of several sparse recovery algorithms in the context of our bounds.
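To see why the restricted isometry property forces x̂ and x* to be close, here is a minimal sketch of the standard triangle-inequality argument (our illustration of the general mechanism, not the paper's sharper online analysis): if K ≥ K*, the difference x̂ − x* is at most 2K-sparse, so the restricted isometry constant δ_2K of Φ lower-bounds how much Φ can shrink it, while the residual ε together with the noise norm upper-bounds ∥Φ(x̂ − x*)∥:

```latex
\|\hat{x} - x^*\|_2
  \;\le\; \frac{\|\Phi(\hat{x} - x^*)\|_2}{\sqrt{1 - \delta_{2K}}}
  \;=\; \frac{\|(u - \Phi\hat{x}) - n\|_2}{\sqrt{1 - \delta_{2K}}}
  \;\le\; \frac{\varepsilon + \|n\|_2}{\sqrt{1 - \delta_{2K}}}.
```

The online element is that ε is measured after the run, so a bound of this form tightens automatically whenever the algorithm achieves a smaller data error than its pre-run guarantee anticipates.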
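A small numerical sketch of the full pipeline in Python/NumPy is below; the problem sizes, the choice of iterative hard thresholding as the solver, and the plugged-in value of δ_2K are illustrative assumptions, not quantities taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not from the paper).
N, M, K_star = 400, 120, 10
sigma = 0.01

# K*-sparse ground truth and a Gaussian measurement matrix,
# scaled so its columns have unit norm in expectation.
x_star = np.zeros(N)
support = rng.choice(N, size=K_star, replace=False)
x_star[support] = rng.standard_normal(K_star)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
u = Phi @ x_star + sigma * rng.standard_normal(M)

def hard_threshold(v, K):
    """Keep the K largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    keep = np.argpartition(np.abs(v), -K)[-K:]
    out[keep] = v[keep]
    return out

def iht(u, Phi, K, iters=500):
    """Plain iterative hard thresholding with a conservative fixed step."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2  # keeps the gradient step non-expansive
    x = np.zeros(Phi.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + step * Phi.T @ (u - Phi @ x), K)
    return x

K = K_star                             # assume the solver is given the true sparsity
x_hat = iht(u, Phi, K)
eps = np.linalg.norm(u - Phi @ x_hat)  # achieved data error, observed online

# Worst-case bound sketch: ||x_hat - x_star|| <= (eps + ||n||) / sqrt(1 - delta_2K).
# delta_2K is intractable to compute exactly, so we plug in an assumed value,
# and we use sigma * sqrt(M) as a proxy for the (unobserved) noise norm ||n||.
delta_2K = 0.5                         # assumption, NOT certified for this Phi
noise_norm = sigma * np.sqrt(M)
bound = (eps + noise_norm) / np.sqrt(1.0 - delta_2K)

print(f"true error  ||x_hat - x_star||_2 = {np.linalg.norm(x_hat - x_star):.4f}")
print(f"online bound (assumed delta_2K)  = {bound:.4f}")
```

Because ε enters the bound directly, rerunning the solver to a smaller residual immediately yields a tighter distance estimate, which is exactly the kind of data-error versus solution-sparsity trade-off the bounds are meant to expose.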
