Deterministic Compressive Sensing with Groups of Random Variables

Compressed Sensing aims to capture attributes of a sparse signal using very few measurements. Candès and Tao showed that the Restricted Isometry Property (RIP) is sufficient for sparse reconstruction, and that random matrices whose entries are generated by an i.i.d. Gaussian or Bernoulli process satisfy the RIP with high probability. This approach treats all k-sparse signals as equally likely, in contrast to mainstream signal processing, where the filtering is deterministic and the signal is described probabilistically. This paper provides weak conditions that suffice to show that a deterministic sensing matrix satisfies the Statistical Restricted Isometry Property (StRIP) and to guarantee the uniqueness of k-sparse representations. The columns of the sensing matrix are required to form a group under pointwise multiplication, and it is this property that gives the concentration inequalities of McDiarmid and Bernstein the leverage needed to guarantee uniqueness of sparse representations. We provide a large class of deterministic sensing matrices for which Basis Pursuit can recover sparse Steinhaus signals. The new framework encompasses many families of deterministic sensing matrices, including those formed from discrete chirps, Delsarte-Goethals codes, and Extended BCH codes. In these cases it provides theoretical guarantees on the performance of nonlinear reconstruction algorithms with complexity that is only quadratic in the dimension of the measurement domain.
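The abstract refers to deterministic sensing matrices built from discrete chirps whose columns form a group under pointwise multiplication, and to recovery of sparse Steinhaus signals. The sketch below is a minimal numerical illustration of those two ingredients, not the authors' code: it assumes the standard p × p² chirp construction over a prime p, verifies closure of the columns under pointwise multiplication, and empirically probes how ||Φx||² concentrates around ||x||² for random k-sparse Steinhaus signals, which is the quantity the StRIP controls. The choices p = 31, k = 5, and the trial count are arbitrary and purely illustrative.

```python
# Minimal sketch (illustrative, not the authors' implementation): build a
# p x p^2 discrete chirp sensing matrix, check that its columns are closed
# under pointwise multiplication (the group property used in the paper),
# and empirically probe the Statistical RIP on random k-sparse Steinhaus signals.
import numpy as np

p = 31                      # prime; measurement dimension (illustrative choice)
t = np.arange(p)

def chirp_column(a, b):
    """phi_{a,b}(t) = exp(2*pi*i*(a*t^2 + b*t)/p), normalized to unit norm."""
    return np.exp(2j * np.pi * ((a * t * t + b * t) % p) / p) / np.sqrt(p)

# Sensing matrix Phi: p rows, p^2 columns indexed by (a, b) in Z_p x Z_p.
Phi = np.column_stack([chirp_column(a, b) for a in range(p) for b in range(p)])

# Group property: the pointwise product of two columns is again a column
# (up to the 1/sqrt(p) normalization), since the chirp exponents add mod p.
a1, b1, a2, b2 = 3, 7, 11, 2
prod = chirp_column(a1, b1) * chirp_column(a2, b2) * np.sqrt(p)
assert np.allclose(prod, chirp_column((a1 + a2) % p, (b1 + b2) % p))

# Empirical StRIP check: ||Phi x||^2 should concentrate around ||x||^2 when
# the support of x is random and its nonzero entries are Steinhaus
# (uniform on the complex unit circle).
rng = np.random.default_rng(0)
k = 5                       # sparsity level (illustrative choice)
ratios = []
for _ in range(2000):
    support = rng.choice(p * p, size=k, replace=False)
    x = np.zeros(p * p, dtype=complex)
    x[support] = np.exp(2j * np.pi * rng.random(k))   # Steinhaus entries
    ratios.append(np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2)

print("mean ||Phi x||^2 / ||x||^2 :", np.mean(ratios))
print("empirical spread (std)     :", np.std(ratios))
```

Under a StRIP-type guarantee one expects the printed ratios to concentrate near 1 for most sparse Steinhaus signals, rather than for all k-sparse signals uniformly. Basis Pursuit recovery itself would additionally require an ℓ1 solver and is omitted from this sketch.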
