A Robust Fully Correntropy-Based Sparse Modeling Alternative to Dictionary Learning

Correntropy is a dependence measure that extends beyond Gaussian environments and optimization based on the Mean Squared Error (MSE). Its ability to induce a metric fully modulated by a single kernel parameter makes it an attractive tool for adaptive signal processing. We propose a sparse modeling framework based on the dictionary learning technique known as K-SVD, in which Correntropy replaces MSE in both the sparse coding and dictionary update subroutines. The former yields a robust variant of Orthogonal Matching Pursuit, while the latter exploits robust Singular Value Decompositions. The result is Correntropy-based dictionary learning. The data-driven nature of the approach combines two appealing features of unsupervised learning, robustness and sparseness, without adding hyperparameters to the framework. Robust recovery of bases from synthetic data and image denoising under impulsive noise confirm the advantages of the proposed techniques.
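For context, correntropy and the Maximum Correntropy Criterion (MCC) are standardly defined as follows; this is the textbook Gaussian-kernel form, and the paper's particular kernel-width selection is not restated here:

$$
V_\sigma(X,Y) = \mathbb{E}\left[\kappa_\sigma(X - Y)\right], \qquad
\kappa_\sigma(e) = \exp\!\left(-\frac{e^2}{2\sigma^2}\right), \qquad
\hat{V}_\sigma = \frac{1}{N}\sum_{i=1}^{N}\kappa_\sigma(x_i - y_i).
$$

Maximizing $\hat{V}_\sigma$ of the residuals (MCC) takes the place of minimizing their average square (MSE); because $\kappa_\sigma$ saturates for large errors, gross outliers contribute almost nothing to the objective.

The sketch below shows, under stated assumptions, how an MCC objective of this kind is commonly optimized with half-quadratic (iteratively reweighted least squares) updates. The function name, weighting loop, and toy data are illustrative assumptions and are not the paper's exact robust OMP or robust SVD subroutines.

```python
import numpy as np

def correntropy_weighted_ls(A, y, sigma=1.0, n_iter=10):
    """Half-quadratic (IRLS) sketch for max_x sum_i exp(-(y_i - A_i x)^2 / (2 sigma^2)).

    Illustrative only: shows the generic correntropy-induced reweighting idea,
    not the paper's specific sparse coding or dictionary update routines.
    """
    x = np.linalg.lstsq(A, y, rcond=None)[0]       # ordinary least-squares initialization
    for _ in range(n_iter):
        r = y - A @ x                              # residuals under current estimate
        w = np.exp(-r**2 / (2 * sigma**2))         # Gaussian-kernel weights: outliers -> ~0
        W = np.diag(w)
        # weighted normal equations (A^T W A) x = A^T W y, with a tiny ridge for stability
        x = np.linalg.solve(A.T @ W @ A + 1e-12 * np.eye(A.shape[1]), A.T @ W @ y)
    return x

# toy usage: a few impulsive outliers barely affect the correntropy-based fit
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
y = A @ x_true + 0.01 * rng.standard_normal(100)
y[:5] += 10.0                                      # impulsive corruption on five samples
x_hat = correntropy_weighted_ls(A, y, sigma=0.5)
print(np.round(x_hat - x_true, 3))                 # errors remain small despite the outliers
```

In a correntropy-based sparse coder, a reweighted solve of this kind would stand in for the ordinary least-squares coefficient update, so that impulsive samples receive near-zero weight in each iteration.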
