Compressive sensing predicts that sparse vectors can be recovered efficiently from highly undersampled measurements. While it is well understood by now that Gaussian random matrices provide optimal measurement matrices in this context, such "highly" random matrices suffer from certain drawbacks: applications require more structure arising from physical or other constraints, and recovery algorithms such as greedy methods or algorithms for ℓ1-minimization demand fast matrix-vector multiplication in order to be feasible for large-scale problems. To meet these desiderata, we study two types of structured random measurement matrices: partial random circulant matrices, and random sampling matrices associated to bounded orthonormal systems (e.g. random Fourier-type matrices). The latter may be used to study reconstruction problems in high spatial dimensions.

Compressive Sensing. A vector x ∈ C^N is called s-sparse if ‖x‖_0 := #{ℓ : x_ℓ ≠ 0} ≤ s. The ℓ_p-norm is defined as usual, ‖x‖_p := (∑_{ℓ=1}^N |x_ℓ|^p)^{1/p}, 0 < p < ∞. The best s-term approximation error of an arbitrary vector x ∈ C^N in ℓ_p is defined as σ_s(x)_p := inf{‖x − z‖_p : z ∈ C^N is s-sparse}.
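The three quantities defined above can be illustrated with a small numerical sketch (not from the paper; the function names are my own, and the code uses plain Python). The key fact it exploits: the infimum in the best s-term approximation error is attained by keeping the s largest-magnitude entries of x and measuring the remaining tail in the ℓ_p norm.

```python
def l0(x):
    # ‖x‖_0 = number of nonzero entries (a count, not a true norm)
    return sum(1 for v in x if v != 0)

def lp_norm(x, p):
    # ‖x‖_p = (sum_l |x_l|^p)^(1/p), for 0 < p < ∞
    return sum(abs(v) ** p for v in x) ** (1.0 / p)

def best_s_term_error(x, s, p):
    # sigma_s(x)_p = inf over s-sparse z of ‖x - z‖_p,
    # attained by zeroing out all but the s largest-magnitude entries,
    # so the error is the ℓ_p norm of the remaining "tail" entries.
    tail = sorted(abs(v) for v in x)[:max(len(x) - s, 0)]
    return sum(t ** p for t in tail) ** (1.0 / p)

x = [3.0, 0.0, -1.0, 0.5, 0.0]
print(l0(x))                       # 3 nonzero entries, so x is 3-sparse
print(lp_norm(x, 2))               # Euclidean norm of x
print(best_s_term_error(x, 2, 2))  # 0.5: keeping the 2 largest entries
                                   # leaves a tail of magnitude 0.5
```

Note that an s-sparse vector has best s-term approximation error exactly zero, which is why this quantity measures how compressible a general (non-sparse) vector is.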