Support recovery with sparsely sampled free random matrices

Consider a Bernoulli-Gaussian complex n-vector whose components are X_i B_i, where B_i ∼ Bernoulli(q) and X_i ∼ CN(0, σ²) are i.i.d. across i and mutually independent. This random q-sparse vector is multiplied by a random matrix U, and a randomly chosen subset of the components of the resulting vector, of average size np with p ∈ [0, 1], is then observed in additive Gaussian noise. We extend the scope of conventional noisy compressive sampling models, in which U is typically the identity or a matrix with i.i.d. components, to matrices U that satisfy a certain freeness condition, which encompasses Haar matrices and other unitarily invariant matrices. Using the replica method and the decoupling principle of Guo and Verdú, as well as a number of information-theoretic bounds, we study the input-output mutual information and the support recovery error rate as n → ∞.
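The measurement model in the abstract can be sketched as follows; this is a minimal NumPy simulation, not the paper's code, and the parameter values (n, q, p, sigma2, noise_var) and the choice of a Haar unitary for U are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000          # signal dimension (illustrative)
q = 0.1           # sparsity rate: P(B_i = 1)
p = 0.5           # average fraction of observed components
sigma2 = 1.0      # variance of the nonzero complex Gaussian entries
noise_var = 0.01  # observation noise variance (assumed value)

# Bernoulli-Gaussian q-sparse complex vector with components X_i B_i
b = rng.random(n) < q
x = b * (rng.normal(scale=np.sqrt(sigma2 / 2), size=n)
         + 1j * rng.normal(scale=np.sqrt(sigma2 / 2), size=n))

# Haar-distributed unitary U via QR of a complex Gaussian matrix,
# with the diagonal phase correction that makes the law exactly Haar
a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
qmat, r = np.linalg.qr(a)
d = np.diag(r)
u = qmat * (d / np.abs(d))  # scale column j by the phase of r_jj

# Observe a random subset of the components of Ux (average size np)
# in additive circularly symmetric complex Gaussian noise
mask = rng.random(n) < p
noise = (rng.normal(scale=np.sqrt(noise_var / 2), size=n)
         + 1j * rng.normal(scale=np.sqrt(noise_var / 2), size=n))
y = (u @ x + noise)[mask]
```

Support recovery then amounts to estimating the pattern b from y, the sampling mask, and U; the special case where u is the identity recovers the conventional subsampled-noisy model.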

[1] H. Nishimori, Statistical Physics of Spin Glasses and Information Processing, 2001.

[2] Harald Cramér, Sur un nouveau théorème-limite de la théorie des probabilités, 2018.

[3] Galen Reeves et al., Sparsity Pattern Recovery in Compressed Sensing, 2011.

[4] Andrea Montanari et al., The dynamics of message passing on dense graphs, with applications to compressed sensing, 2010 IEEE International Symposium on Information Theory.

[5] Georgios B. Giannakis et al., Sound Field Reproduction using the Lasso, 2010, IEEE Transactions on Audio, Speech, and Language Processing.

[6] William Feller, An Introduction to Probability Theory and Its Applications, 1967.

[7] Martin J. Wainwright et al., Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices, 2008, IEEE Transactions on Information Theory.

[8] Martin J. Wainwright et al., Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting, 2009, IEEE Transactions on Information Theory.

[9] Yihong Wu et al., Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression, 2010, IEEE Transactions on Information Theory.

[10] Galen Reeves et al., Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds, 2010, IEEE Transactions on Information Theory.

[11] Kamiar Rahnama Rad, Nearly Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery, 2009, IEEE Transactions on Information Theory.

[12] A. Guionnet et al., A Fourier view on the R-transform and related asymptotics of spherical integrals, 2005.

[13] Antonia Maria Tulino et al., Random Matrix Theory and Wireless Communications, 2004, Foundations and Trends in Communications and Information Theory.

[14] Vahid Tarokh et al., Shannon-Theoretic Limits on Noisy Compressive Sampling, 2007, IEEE Transactions on Information Theory.

[15] R. Tibshirani, Regression Shrinkage and Selection via the Lasso, 1996.

[16] Shlomo Shamai et al., Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.

[17] William Feller, An Introduction to Probability Theory and Its Applications, 1951.

[18] Sundeep Rangan et al., Necessary and Sufficient Conditions for Sparsity Pattern Recovery, 2008, IEEE Transactions on Information Theory.

[19] Andrea Montanari et al., Message-passing algorithms for compressed sensing, 2009, Proceedings of the National Academy of Sciences.

[20] Yoshiyuki Kabashima et al., Statistical mechanical analysis of a typical reconstruction limit of compressed sensing, 2010 IEEE International Symposium on Information Theory.

[21] P. Tseng et al., Block Coordinate Relaxation Methods for Nonparametric Wavelet Denoising, 2000.

[22] C. Itzykson et al., The planar approximation. II, 1980.

[23] Sergio Verdú et al., MMSE Dimension, 2011, IEEE Transactions on Information Theory.

[24] Toshiyuki Tanaka et al., A statistical-mechanics approach to large-system analysis of CDMA multiuser detectors, 2002, IEEE Transactions on Information Theory.

[25] Venkatesh Saligrama et al., Information Theoretic Bounds for Compressed Sensing, 2008, IEEE Transactions on Information Theory.

[26] Sergio Verdú et al., Randomly spread CDMA: asymptotics via statistical physics, 2005, IEEE Transactions on Information Theory.

[27] Toshiyuki Tanaka, Generic Multiuser Detection and Statistical Physics, 2009.

[28] Dongning Guo et al., A single-letter characterization of optimal noisy compressed sensing, 2009, 47th Annual Allerton Conference on Communication, Control, and Computing.

[29] J. Hubbard, Calculation of Partition Functions, 1959.

[30] Galen Reeves et al., The Sampling Rate-Distortion Tradeoff for Sparsity Pattern Recovery in Compressed Sensing, 2010, IEEE Transactions on Information Theory.

[31] Sergio Verdú et al., Optimal Phase Transitions in Compressed Sensing, 2011, IEEE Transactions on Information Theory.

[32] David L. Donoho, Compressed sensing, 2006, IEEE Transactions on Information Theory.

[33] R. L. Stratonovich, On a Method of Calculating Quantum Distribution Functions, 1957.

[34] Toshiyuki Tanaka, Asymptotics of Harish-Chandra-Itzykson-Zuber integrals and free probability theory, 2008.

[35] Emmanuel J. Candès et al., Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?, 2004, IEEE Transactions on Information Theory.

[36] Shlomo Shamai et al., Capacity of Channels With Frequency-Selective and Time-Selective Fading, 2010, IEEE Transactions on Information Theory.

[37] Michael Elad et al., Stable recovery of sparse overcomplete representations in the presence of noise, 2006, IEEE Transactions on Information Theory.

[38] Sundeep Rangan et al., Asymptotic Analysis of MAP Estimation via the Replica Method and Applications to Compressed Sensing, 2009, IEEE Transactions on Information Theory.

[39] Brendt Wohlberg et al., Noise sensitivity of sparse signal representations: reconstruction error bounds for the inverse problem, 2003, IEEE Transactions on Signal Processing.