The Noise-Sensitivity Phase Transition in Compressed Sensing

Consider the noisy underdetermined system of linear equations y = Ax_0 + z, with A an n × N measurement matrix, n < N, and z ~ N(0, σ²I) Gaussian white noise. Both y and A are known, both x_0 and z are unknown, and we seek an approximation to x_0. When x_0 has few nonzeros, useful approximations are often obtained by ℓ_1-penalized ℓ_2 minimization, in which the reconstruction x̂^{1,λ} solves min { ||y − Ax||_2²/2 + λ||x||_1 }. Consider the reconstruction mean-squared error MSE = E||x̂^{1,λ} − x_0||_2²/N, and define the ratio MSE/σ² as the noise sensitivity. Consider matrices A with i.i.d. Gaussian entries and a large-system limit in which n, N → ∞ with n/N → δ and k/n → ρ, where k is the number of nonzeros of x_0. We develop exact expressions for the asymptotic MSE of x̂^{1,λ} and evaluate its worst-case noise sensitivity over all types of k-sparse signals. The phase space 0 ≤ δ, ρ ≤ 1 is partitioned by the curve ρ = ρ_MSE(δ) into two regions: formal noise sensitivity is bounded throughout the region ρ < ρ_MSE(δ) and unbounded throughout the region ρ > ρ_MSE(δ). The phase boundary ρ = ρ_MSE(δ) is identical to the previously known phase-transition curve for ℓ_1–ℓ_0 equivalence in the k-sparse noiseless case. Hence, a single phase boundary describes the fundamental phase transitions in both the noiseless and noisy cases. Extensive computational experiments validate these predictions, including the game-theoretic structure underlying them (saddlepoints in the payoff, least-favorable signals, and maximin penalization). Underlying our formalism is an approximate message passing (AMP) soft-thresholding algorithm introduced earlier by the authors. Other papers by the authors detail expressions for the formal MSE of AMP and its close connection to ℓ_1-penalized reconstruction. The focus of the present paper is on computing the minimax formal MSE within the class of sparse signals x_0.

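For illustration, the sketch below implements one common form of an AMP soft-thresholding iteration for the LASSO-type problem described in the abstract, with an i.i.d. Gaussian measurement matrix. It is a minimal sketch, not the authors' reference implementation: the threshold rule θ_t = α·τ_t with τ_t estimated from the residual, the tuning constant α, and the toy problem sizes are assumptions made here for concreteness.

```python
import numpy as np


def soft_threshold(v, theta):
    """Componentwise soft thresholding: eta(v; theta) = sign(v) * max(|v| - theta, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)


def amp_lasso(y, A, alpha=1.5, n_iter=50):
    """AMP iteration with soft thresholding (illustrative sketch).

    Uses the threshold theta_t = alpha * tau_t, where tau_t estimates the
    effective noise level from the current residual; alpha is a tuning knob
    (assumed here, not a value prescribed by the paper).
    """
    n, N = A.shape
    delta = n / N
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        tau = np.sqrt(np.mean(z ** 2))            # estimated effective noise std
        pseudo_data = x + A.T @ z                 # x_t + A^T z_t
        x_new = soft_threshold(pseudo_data, alpha * tau)
        # Onsager correction: (1/delta) * <eta'> * z_t, with <eta'> = ||x_new||_0 / N
        onsager_coef = (np.count_nonzero(x_new) / N) / delta
        z = y - A @ x_new + onsager_coef * z
        x = x_new
    return x


# Toy usage with assumed sizes (not the paper's experimental setup).
rng = np.random.default_rng(0)
N, n, k, sigma = 2000, 1000, 100, 0.1
A = rng.normal(size=(n, N)) / np.sqrt(n)          # i.i.d. Gaussian, unit-norm columns in expectation
x0 = np.zeros(N)
x0[:k] = rng.normal(size=k)                       # k-sparse signal
y = A @ x0 + sigma * rng.normal(size=n)
x_hat = amp_lasso(y, A)
print("empirical MSE / sigma^2:", np.mean((x_hat - x0) ** 2) / sigma ** 2)
```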