Denoising sparse noise via online dictionary learning

The idea of learning overcomplete dictionaries based on the paradigm of compressive sensing has found numerous applications, among which image denoising is considered one of the most successful. However, many state-of-the-art denoising techniques inherently assume that the signal noise is Gaussian. We instead propose to learn overcomplete dictionaries where the signal is allowed to have both Gaussian and (sparse) Laplacian noise. Dictionary learning in this setting leads to a difficult non-convex optimization problem, which is further exacerbated by large input datasets. We tackle these difficulties by developing an efficient online algorithm that scales with the size of the data. To assess the efficacy of our model, we apply it to learning dictionaries for data that naturally satisfy our noise model, namely Scale Invariant Feature Transform (SIFT) descriptors. For these data, we measure the performance of the learned dictionary on the task of nearest-neighbor retrieval: compared to methods that do not explicitly model sparse noise, our method exhibits superior performance.
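To make the stated noise model concrete: each signal x is modeled as x ~ D a + s + g, where D is the overcomplete dictionary, a is a sparse code, s is sparse (Laplacian) noise, and g is dense Gaussian noise, and the unknowns are found by minimizing 0.5*||x - D a - s||^2 + lam_code*||a||_1 + lam_noise*||s||_1. The sketch below is a generic proximal-gradient (ISTA-style) rendering of this objective with a stochastic online dictionary update; the function names, step-size choice, and the weights lam_code and lam_noise are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def encode(D, x, lam_code, lam_noise, n_iters=100):
    """Estimate a sparse code `a` and sparse noise `s` for one signal x by
    ISTA steps on 0.5*||x - D a - s||^2 + lam_code*||a||_1 + lam_noise*||s||_1.
    (A sketch of the stated model, not the paper's algorithm.)"""
    a = np.zeros(D.shape[1])
    s = np.zeros(x.shape[0])
    # Step size from a crude Lipschitz bound on the joint quadratic term.
    step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1.0)
    for _ in range(n_iters):
        r = D @ a + s - x                                   # residual
        a = soft_threshold(a - step * (D.T @ r), step * lam_code)
        s = soft_threshold(s - step * r, step * lam_noise)
    return a, s

def dictionary_step(D, x, a, s, lr=0.01):
    """One stochastic-gradient update of D on a single sample, followed by
    renormalizing each atom (column) to unit norm."""
    r = D @ a + s - x
    D = D - lr * np.outer(r, a)   # gradient of 0.5*||x - D a - s||^2 w.r.t. D
    return D / np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)

# Toy usage: 128-dimensional SIFT-like descriptors, 256-atom dictionary.
rng = np.random.default_rng(0)
D = rng.standard_normal((128, 256))
D /= np.linalg.norm(D, axis=0, keepdims=True)
for x in rng.standard_normal((10, 128)):                    # stream of signals
    a, s = encode(D, x, lam_code=0.2, lam_noise=0.2)
    D = dictionary_step(D, x, a, s)
```

Because each incoming signal triggers one encode/update pair and is then discarded, the memory footprint stays constant in the number of samples, which is what lets the approach scale to large datasets.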
