Random Fourier Filters Under Maximum Correntropy Criterion

Random Fourier adaptive filters (RFAFs) project the original data into a high-dimensional random Fourier feature space (RFFS), so that the filter's network structure is fixed while performance comparable to that of kernel adaptive filters is retained. The error criterion commonly used in RFAFs is the well-known minimum mean-square error (MMSE) criterion, which is optimal only under the Gaussian noise assumption; in the presence of non-Gaussian noise, it suffers from instability and performance deterioration. To improve the robustness of RFAFs against large outliers, the maximum correntropy criterion (MCC) is applied in the RFFS, yielding a novel robust random Fourier filter under maximum correntropy (RFFMC). To further improve filtering accuracy, a random-batch RFFMC (RB-RFFMC) is also presented. In addition, a theoretical analysis of the convergence characteristics and the steady-state excess mean-square error of RFFMC and RB-RFFMC is provided to validate their superior performance. Simulation results illustrate that RFFMC and its extension deliver desirable filtering accuracy and robustness, especially in the presence of impulsive noise.
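The idea can be sketched in a few lines: map each input through a fixed random Fourier feature map (approximating a Gaussian kernel), then adapt a linear weight vector with a stochastic-gradient update that maximizes the Gaussian correntropy of the error, which exponentially down-weights large (outlier-driven) errors. This is a minimal illustrative sketch, not the paper's exact algorithm; the step size `mu`, the correntropy kernel width `sigma`, the feature count `D`, and the toy system `f` are all assumed values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random Fourier feature map approximating the Gaussian kernel
# k(x, y) = exp(-||x - y||^2 / (2 s^2)); W and b are drawn once and then fixed,
# which is what gives the filter its fixed network structure.
d_in, D = 2, 64                                # input dimension, number of random features
s = 1.0                                        # kernel bandwidth (assumed)
W = rng.normal(0.0, 1.0 / s, size=(D, d_in))   # random frequencies
b = rng.uniform(0.0, 2 * np.pi, size=D)        # random phases

def rff(x):
    """Project x into the random Fourier feature space."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

sigma = 2.0   # correntropy kernel width (assumed)
mu = 0.2      # step size (assumed)

def rffmc_step(x, d, w):
    """One MCC stochastic-gradient step in the feature space.

    Maximizing J = exp(-e^2 / (2 sigma^2)) yields an LMS-like update whose
    error term is weighted by exp(-e^2 / (2 sigma^2)), so impulsive errors
    contribute almost nothing to the weight change.
    """
    phi = rff(x)
    e = d - w @ phi
    return w + mu * np.exp(-e**2 / (2 * sigma**2)) * e * phi

# Toy system identification with occasional impulsive noise (hypothetical target).
f = lambda x: np.sin(x[0]) + 0.5 * x[1]
w = np.zeros(D)
for n in range(5000):
    x = rng.normal(size=d_in)
    noise = rng.normal(scale=0.05) + (10.0 if rng.random() < 0.02 else 0.0)
    w = rffmc_step(x, f(x) + noise, w)
```

A plain MMSE (LMS-style) update is recovered by dropping the `exp(-e**2 / (2 * sigma**2))` factor; with the impulsive-noise term above, that variant is repeatedly knocked away from the solution, which is the instability the correntropy weighting suppresses.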
