Robustness of Maximum Correntropy Estimation Against Large Outliers

The maximum correntropy criterion (MCC) has recently been applied with success to robust regression, classification, and adaptive filtering, where correntropy is maximized rather than the well-known mean square error (MSE) minimized, in order to improve robustness against outliers (or impulsive noise). Considerable effort has been devoted to developing robust adaptive algorithms under MCC, but so far little insight has been gained into how the optimal solution is affected by outliers. In this work, we study this problem in the context of parameter estimation for a simple linear errors-in-variables (EIV) model in which all variables are scalar. Under certain conditions, we derive an upper bound on the absolute value of the estimation error and show that the optimal solution under MCC can be very close to the true value of the unknown parameter even when outliers (whose values can be arbitrarily large) are present in both the input and output variables. Illustrative examples are presented to verify and clarify the theory.
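To make the contrast between MCC and MSE concrete, below is a minimal sketch (not the paper's exact EIV setup or its bound): a scalar parameter w in y = w·x is estimated from data in which a few samples of both x and y are corrupted by arbitrarily large outliers. The kernel width `sigma`, the outlier magnitudes, and the fixed-point (half-quadratic) solver are illustrative assumptions; the sketch only shows how the Gaussian correntropy weights suppress large residuals, whereas the least-squares (MSE) estimate is pulled away by them.

```python
# Sketch: MCC estimation of a scalar parameter w in y = w * x with large
# outliers injected into both input and output samples. sigma and the
# fixed-point solver are illustrative choices, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

w_true = 2.0
n = 200
x = rng.normal(0.0, 1.0, n)
y = w_true * x + rng.normal(0.0, 0.1, n)       # small Gaussian noise

# Inject a few arbitrarily large outliers into both variables.
idx = rng.choice(n, size=10, replace=False)
x[idx] += rng.normal(0.0, 50.0, 10)
y[idx] += rng.normal(0.0, 50.0, 10)

def mcc_estimate(x, y, sigma=1.0, iters=50):
    """Fixed-point iteration: w <- sum(g_i x_i y_i) / sum(g_i x_i^2),
    with Gaussian weights g_i = exp(-(y_i - w x_i)^2 / (2 sigma^2))."""
    w = np.sum(x * y) / np.sum(x * x)           # least-squares initialisation
    for _ in range(iters):
        e = y - w * x
        g = np.exp(-e**2 / (2.0 * sigma**2))    # correntropy-induced weights
        w = np.sum(g * x * y) / np.sum(g * x * x)
    return w

w_ls = np.sum(x * y) / np.sum(x * x)            # ordinary least squares (MSE)
w_mcc = mcc_estimate(x, y, sigma=1.0)
print(f"true w = {w_true:.3f}, LS = {w_ls:.3f}, MCC = {w_mcc:.3f}")
```

On typical runs the MCC estimate stays close to the true parameter while the least-squares estimate is noticeably biased by the corrupted samples, which is the qualitative behaviour the paper's error bound formalizes.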
