Error Entropy, Correntropy and M-Estimation

Minimization of the error entropy (MEE) cost function was introduced for nonlinear and non-Gaussian signal processing. In this paper, we show that this cost function is closely related to the recently introduced correntropy criterion and to M-estimation, which also theoretically explains the robustness of MEE to outliers. Based on this understanding, we propose a modification of the MEE cost function, named minimization of error entropy with fiducial points, which sets the bias for MEE in an elegant and robust way. The performance of this new criterion is compared with the original MEE and the mean square error (MSE) criterion in robust regression and in short-term prediction of a chaotic time series.
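To make the relationship concrete, the sketch below computes empirical versions of the criteria involved on a vector of error samples. It is a minimal sketch, assuming a Gaussian kernel with bandwidth sigma and expressing the fiducial-point criterion as a convex combination (weight lam) of a correntropy term, which anchors the errors at the origin, and the pairwise information-potential term underlying MEE; the function names and the combination weight are illustrative, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    """Gaussian kernel G_sigma(x), evaluated elementwise."""
    return np.exp(-x**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

def correntropy(e, sigma):
    """Empirical correntropy of the errors: mean kernel value at each error.
    Maximizing it (the MCC criterion) concentrates errors near zero, and the
    bounded kernel limits the influence of any single outlier."""
    return np.mean(gaussian_kernel(e, sigma))

def information_potential(e, sigma):
    """Quadratic information potential: mean kernel over all error pairs.
    Maximizing it minimizes Renyi's quadratic error entropy (MEE); note it is
    invariant to a shift of the errors, so the bias must be set separately."""
    diffs = e[:, None] - e[None, :]
    return np.mean(gaussian_kernel(diffs, sigma))

def mee_fiducial(e, sigma, lam=0.5):
    """MEE with fiducial points at the origin, sketched here as a convex
    combination (an assumption of this sketch): lam = 1 recovers pure
    correntropy maximization, lam = 0 recovers plain MEE."""
    return lam * correntropy(e, sigma) + (1 - lam) * information_potential(e, sigma)

# Example: errors containing one gross outlier. The kernel saturates for the
# outlier, so the cost is dominated by the well-fit samples.
e = np.array([0.1, -0.2, 0.05, 5.0])
print(mee_fiducial(e, sigma=1.0, lam=0.5))
```

Because the Gaussian kernel is bounded, each error contributes at most a fixed amount to either term; this is the same mechanism that gives M-estimators with redescending influence functions their robustness to outliers.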
