Mean square convergence analysis for kernel least mean square algorithm

In this paper, we study the mean square convergence of the kernel least mean square (KLMS) algorithm. We first establish the fundamental energy conservation relation in the feature space. Starting from this relation, we carry out a mean square convergence analysis and obtain several important theoretical results: an upper bound on the step size that guarantees mean square convergence, the theoretical steady-state excess mean square error (EMSE), an optimal step size for the fastest convergence, and an optimal kernel size for the fastest initial convergence. Monte Carlo simulation results agree well with the theoretical analysis.
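To make the roles of the step size and kernel size concrete, the following is a minimal sketch of the standard KLMS recursion with a Gaussian kernel: the filter output is a kernel expansion over past inputs, and each new coefficient is the step size times the prediction error. The function names, the toy sine-regression data, and the parameter values (`eta`, `sigma`) are illustrative choices, not taken from the paper's experiments.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    """Gaussian kernel between an input x and a stored center c."""
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

def klms(inputs, desired, eta=0.2, sigma=1.0):
    """Kernel LMS sketch: grow a dictionary of centers, one per sample.

    eta   : step size (must be small enough for mean square convergence)
    sigma : Gaussian kernel size (affects initial convergence speed)
    Returns the sequence of a priori prediction errors e_n.
    """
    centers, coeffs, errors = [], [], []
    for u, d in zip(inputs, desired):
        # Prediction: f_{n-1}(u_n) = sum_i a_i * kappa(u_n, c_i)
        y = sum(a * gaussian_kernel(u, c, sigma)
                for a, c in zip(coeffs, centers))
        e = d - y                 # a priori error
        errors.append(e)
        centers.append(u)         # new center is the current input
        coeffs.append(eta * e)    # new coefficient a_n = eta * e_n
    return np.array(errors)
```

On a toy nonlinear regression (e.g. learning `sin(u)` from noisy samples), the squared errors decay toward a steady-state floor; in the paper's terms, that floor above the noise variance is the EMSE, and both the decay rate and the floor depend on `eta` and `sigma`.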
