MMSE Approximation For Sparse Coding Algorithms Using Stochastic Resonance

Sparse coding refers to the pursuit of the sparsest representation of a signal in a typically overcomplete dictionary. From a Bayesian perspective, sparse coding provides a maximum a posteriori (MAP) estimate of the unknown vector under a sparse prior. In this paper, we suggest enhancing the performance of sparse coding algorithms by deliberately contaminating the input with controlled random noise, a phenomenon known as stochastic resonance. The proposed method adds controlled noise to the input and estimates a sparse representation from the perturbed signal. Repeating this procedure with independent noise realizations yields a set of candidate solutions, each obtained by projecting the original input signal onto the recovered support. We present two variants of this method, which differ in their final step. The first is a provably convergent approximation to the minimum mean square error (MMSE) estimator, which relies on the generative model and applies a weighted average over the recovered solutions. The second is a relaxed variant of the former that simply applies an empirical mean. We show that both methods provide a computationally efficient approximation to the MMSE estimator, which is typically intractable to compute. We demonstrate our findings empirically and provide a theoretical analysis of our method in several different settings.
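As a rough illustration of the relaxed (empirical-mean) variant, the sketch below perturbs the input with Gaussian noise, recovers a support for each noisy draw with a generic sparse coder, projects the original (clean) signal onto each recovered support, and averages the resulting solutions. The choice of Orthogonal Matching Pursuit as the base coder, the noise level `sr_sigma`, and the number of draws are illustrative assumptions, not the paper's exact configuration; the first variant would replace the plain mean with weights derived from the generative model.

```python
# Minimal sketch of the stochastic-resonance empirical-mean estimator.
# Assumptions: OMP as the base sparse coder, Gaussian perturbation noise,
# and a plain empirical mean over the projected solutions.
import numpy as np

def omp(D, y, k):
    """Greedy Orthogonal Matching Pursuit: return the recovered support (atom indices)."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))        # most correlated atom
        support.append(j)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs              # update residual
    return support

def sr_sparse_estimate(D, y, k, n_draws=50, sr_sigma=0.05, rng=None):
    """Average the projections of y onto supports recovered from noisy copies of y."""
    rng = np.random.default_rng(rng)
    n, m = D.shape
    x_hat = np.zeros(m)
    for _ in range(n_draws):
        y_noisy = y + sr_sigma * rng.standard_normal(n)    # controlled perturbation
        support = omp(D, y_noisy, k)                        # sparse coding on the noisy input
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # project the clean y
        x_draw = np.zeros(m)
        x_draw[support] = coeffs
        x_hat += x_draw
    return x_hat / n_draws                                  # empirical mean over candidate solutions
```

The averaging step is what distinguishes this estimator from a single MAP-style pursuit: each noisy draw may land on a different support, and the mean over these projections mimics the posterior averaging that the exact MMSE estimator performs over all supports.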
