Approximate Message Passing with Universal Denoising

We study compressed sensing (CS) signal reconstruction problems where an input signal is measured via matrix multiplication under additive white Gaussian noise. The input signals are assumed to be stationary and ergodic, but their statistics are unknown; the goal is to provide reconstruction algorithms that are universal to the input statistics. We present a novel algorithmic framework that combines: (i) the approximate message passing (AMP) CS reconstruction framework, which solves the matrix channel recovery problem by iterative scalar channel denoising; (ii) a universal denoising scheme based on context quantization, which partitions the problem of denoising a stationary ergodic signal into a collection of independent and identically distributed (i.i.d.) subsequence denoising problems; and (iii) a density estimation approach that approximates the probability distribution of an i.i.d. sequence by fitting a Gaussian mixture (GM) model. In addition to the algorithmic framework, we provide three contributions: (i) numerical results showing that state evolution holds for non-separable Bayesian sliding-window denoisers; (ii) an i.i.d. denoiser based on a modified GM learning algorithm; and (iii) a universal denoiser that does not require the input signal to be bounded. We provide two implementations of our universal CS recovery algorithm, one faster and the other more accurate. Both implementations compare favorably with existing reconstruction algorithms in terms of reconstruction quality and runtime.
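To illustrate how AMP reduces the matrix-channel recovery problem to scalar denoising, the following is a minimal sketch, not the paper's implementation. The function names (gm_posterior_mean, amp_recover), the fixed two-component GM prior, and the finite-difference Onsager estimate are illustrative assumptions; the paper instead learns the GM from the data and applies the resulting denoiser per context-quantized subsequence.

```python
# Minimal sketch: AMP with a plug-in Bayesian scalar denoiser under a GM prior.
# All names and the fixed GM prior are illustrative assumptions, not the
# paper's actual universal denoiser.
import numpy as np

def gm_posterior_mean(v, sigma, weights, means, variances):
    """E[x | v] when x ~ sum_k weights[k] * N(means[k], variances[k])
    and v = x + N(0, sigma^2). v has shape (N,); GM parameters have shape (K,)."""
    var_tot = variances + sigma ** 2                         # marginal variances of v
    diff = v[:, None] - means                                # (N, K)
    log_post = (np.log(weights) - 0.5 * np.log(2 * np.pi * var_tot)
                - 0.5 * diff ** 2 / var_tot)
    post = np.exp(log_post - log_post.max(axis=1, keepdims=True))
    post /= post.sum(axis=1, keepdims=True)                  # component responsibilities
    cond_mean = (means * sigma ** 2 + v[:, None] * variances) / var_tot
    return (post * cond_mean).sum(axis=1)

def amp_recover(y, A, denoiser, num_iters=30, eps=1e-3):
    """AMP for y = A x + noise with a generic separable denoiser eta(v, sigma)."""
    M, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(num_iters):
        v = x + A.T @ z                      # pseudo-data: behaves like x plus AWGN
        sigma = np.sqrt(np.mean(z ** 2))     # estimate of the effective noise level
        x_new = denoiser(v, sigma)
        # Onsager correction; the average denoiser derivative is estimated
        # by a finite difference.
        deriv = np.mean((denoiser(v + eps, sigma) - x_new) / eps)
        z = y - A @ x_new + (N / M) * deriv * z
        x = x_new
    return x

# Usage with a hypothetical two-component GM prior (near-spike at 0 plus a wide component).
rng = np.random.default_rng(0)
N, M = 1000, 400
x_true = rng.normal(size=N) * (rng.random(N) < 0.1)
A = rng.normal(size=(M, N)) / np.sqrt(M)
y = A @ x_true + 0.01 * rng.normal(size=M)
gm = dict(weights=np.array([0.9, 0.1]), means=np.zeros(2), variances=np.array([1e-4, 1.0]))
x_hat = amp_recover(y, A, lambda v, s: gm_posterior_mean(v, s, **gm))
```

The key structural point is that AMP hands the denoiser a pseudo-data vector that is approximately the true signal plus i.i.d. Gaussian noise, so any scalar (or sliding-window) denoiser can be plugged in; the paper's universal denoiser replaces the fixed GM above with a context-quantized, data-driven GM fit.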
