MMSE Estimation of Sparse Lévy Processes

We investigate a stochastic signal-processing framework for signals with sparse derivatives, where the samples of a Lévy process are corrupted by noise. The proposed signal model covers the well-known Brownian motion and piecewise-constant Poisson process; moreover, the Lévy family also contains other interesting members exhibiting heavy-tail statistics that fulfill the requirements of compressibility. We characterize the maximum-a-posteriori probability (MAP) and minimum mean-square error (MMSE) estimators for such signals. Interestingly, some of the MAP estimators for the Lévy model coincide with popular signal-denoising algorithms (e.g., total-variation (TV) regularization). We propose a novel non-iterative implementation of the MMSE estimator based on the belief-propagation (BP) algorithm performed in the Fourier domain. Our algorithm takes advantage of the fact that the joint statistics of general Lévy processes are much easier to describe by their characteristic function, as the probability densities do not always admit closed-form expressions. We then use our new estimator as a benchmark to compare the performance of existing algorithms for the optimal recovery of gradient-sparse signals.
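To make the setting concrete, here is a minimal sketch of the kind of estimation the abstract describes. It is not the paper's implementation: the paper evaluates the belief-propagation messages in the Fourier domain, whereas this sketch only uses the Fourier domain to recover the increment density from its characteristic function, and then runs a standard grid-based forward-backward sweep (exact BP on the chain formed by the independent-increment structure). All parameter values, the grid size, and the helper `spread` are illustrative choices; increments are assumed Laplace-distributed (a heavy-tailed, compressible member of the Lévy family) and the noise additive white Gaussian.

```python
# Minimal sketch (not the paper's algorithm): MMSE denoising of a sampled
# Levy process by exact forward-backward message passing on its Markov chain,
# with the increment density recovered from its characteristic function by FFT.
# Laplace increments, Gaussian noise, and all numeric values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- synthesize a Levy process with Laplace increments, plus Gaussian noise
N, b, sigma = 200, 1.0, 0.5               # number of samples, Laplace scale, noise std
u = rng.laplace(scale=b, size=N)          # i.i.d. increments u_k
s = np.cumsum(u)                          # Levy-process samples s_k = u_1 + ... + u_k
y = s + sigma * rng.normal(size=N)        # noisy observations

# --- symmetric state grid (odd size so that x = 0 sits exactly at the center)
M = 1001
R = np.max(np.abs(y)) + 5.0
x = np.linspace(-R, R, M)
dx = x[1] - x[0]

# --- increment density from its characteristic function (Fourier domain)
# Laplace(0, b) has characteristic function 1 / (1 + b^2 w^2).
w = 2.0 * np.pi * np.fft.fftfreq(M, d=dx)
pU = np.real(np.fft.fftshift(np.fft.ifft(1.0 / (1.0 + (b * w) ** 2)))) / dx
pU = np.clip(pU, 0.0, None)
pU /= pU.sum() * dx                       # normalized density on the grid

# --- Gaussian likelihood of each observation, evaluated on the grid
L = np.exp(-0.5 * ((y[:, None] - x[None, :]) / sigma) ** 2)

def spread(msg):
    """Propagate a message through one increment: convolution with p_U."""
    out = np.convolve(msg, pU, mode='same') * dx
    return out / out.sum()

# --- exact belief propagation on the chain (one forward-backward sweep)
alpha = np.zeros((N, M))
beta = np.ones((N, M))
alpha[0] = L[0] * pU                      # s_1 is a single increment from 0
alpha[0] /= alpha[0].sum()
for k in range(1, N):
    alpha[k] = L[k] * spread(alpha[k - 1])
    alpha[k] /= alpha[k].sum()
for k in range(N - 2, -1, -1):
    # p_U is symmetric, so the backward correlation equals the same convolution
    beta[k] = spread(L[k + 1] * beta[k + 1])

post = alpha * beta                       # posterior marginals (up to scale)
post /= post.sum(axis=1, keepdims=True)
s_mmse = post @ x                         # posterior means = MMSE estimate

snr = lambda e: 10 * np.log10(np.sum(s ** 2) / np.sum(e ** 2))
print("input SNR: %.2f dB   MMSE SNR: %.2f dB" % (snr(y - s), snr(s_mmse - s)))
```

Because the graph is a chain, the sweep above is non-iterative, in line with the abstract's claim. Swapping in another infinitely divisible increment law (e.g., a heavier-tailed member of the Lévy family) only requires changing the characteristic function, which is precisely why working from characteristic functions rather than closed-form densities is convenient.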
