Multikernel Adaptive Filters Under the Minimum Cauchy Kernel Loss Criterion

The Cauchy loss has been successfully applied in robust learning algorithms in the presence of large outliers, but its performance may degrade in complex nonlinear tasks. To address this issue, a novel Cauchy kernel loss is developed by transforming the original data into a reproducing kernel Hilbert space (RKHS) via the kernel trick. Based on the resulting minimum Cauchy kernel loss criterion, the multikernel minimum Cauchy kernel loss (MKMCKL) algorithm is proposed by mapping the input data into multiple RKHSs. The proposed MKMCKL algorithm improves on kernel adaptive filters (KAFs) built on a single kernel and, under impulsive noise, is more stable than multikernel adaptive filters built on the quadratic loss. To further curb the growth of the MKMCKL network, a novel sparsification method is presented that prunes redundant data and thus reduces the computational and storage burden. Simulations on different nonlinear applications illustrate the superior performance of the proposed algorithms under impulsive noise.
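To make the kind of update rule described above concrete, the following is a minimal Python sketch of a multikernel adaptive filter driven by a Cauchy-type robust loss, combined with a simple novelty-based sparsification rule. It is an illustration under stated assumptions, not the paper's MKMCKL algorithm: the sketch applies the classical Cauchy loss directly to the prediction error rather than the Cauchy kernel loss defined in the RKHS, and the kernel widths, step size, Cauchy scale, and novelty threshold are hypothetical parameters.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    """Gaussian kernel between an input x and a stored center c."""
    diff = np.atleast_1d(x) - np.atleast_1d(c)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

class MultikernelCauchyFilter:
    """Illustrative multikernel adaptive filter trained with a Cauchy-type loss.

    NOTE: this is a sketch, not the MKMCKL algorithm of the paper. The robust
    cost used here is the classical Cauchy loss
    L(e) = (gamma^2 / 2) * log(1 + e^2 / gamma^2) applied to the prediction
    error, and sparsification is a simple novelty (distance) criterion.
    `sigmas`, `eta`, `gamma`, and `delta` are hypothetical parameters.
    """

    def __init__(self, sigmas=(0.5, 1.0, 2.0), eta=0.2, gamma=1.0, delta=0.1):
        self.sigmas = sigmas   # one Gaussian kernel per width -> multiple RKHSs
        self.eta = eta         # step size
        self.gamma = gamma     # Cauchy scale; controls outlier suppression
        self.delta = delta     # novelty threshold for sparsification
        self.centers = []      # stored input centers (the "network")
        self.coeffs = []       # per center: one coefficient per kernel

    def predict(self, x):
        """Filter output: sum over centers and kernels of coeff * kernel(x, center)."""
        y = 0.0
        for c, a in zip(self.centers, self.coeffs):
            for sigma, a_m in zip(self.sigmas, a):
                y += a_m * gaussian_kernel(x, c, sigma)
        return y

    def update(self, x, d):
        """One online step: predict, compute Cauchy-weighted error, grow the network."""
        e = d - self.predict(x)
        # Gradient of the Cauchy loss w.r.t. the error: e / (1 + (e/gamma)^2).
        # Large (impulsive) errors are automatically down-weighted.
        g = e / (1.0 + (e / self.gamma) ** 2)
        # Novelty-based sparsification: only add a new center if x is far
        # from every stored center; otherwise skip the dictionary growth.
        if not self.centers or min(
            np.linalg.norm(np.atleast_1d(x) - np.atleast_1d(c)) for c in self.centers
        ) > self.delta:
            self.centers.append(np.atleast_1d(x).copy())
            self.coeffs.append([self.eta * g for _ in self.sigmas])
        return e

# Toy usage: learn a static nonlinearity from noisy samples.
rng = np.random.default_rng(0)
f = MultikernelCauchyFilter()
for _ in range(500):
    x = rng.uniform(-2, 2)
    d = np.sin(2 * x) + 0.05 * rng.standard_normal()
    f.update(x, d)
print("network size:", len(f.centers), "prediction at 0.5:", round(f.predict(0.5), 3))
```

The intent of the sketch is only to show where a Cauchy-type weighting, e / (1 + (e / gamma)^2), enters an online multikernel update and how a sparsification test caps network growth; in the paper the robustness instead comes from the Cauchy kernel loss defined in the kernel space.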
