Stochastic Behavior Analysis of the Gaussian Kernel Least-Mean-Square Algorithm

The kernel least-mean-square (KLMS) algorithm is a popular algorithm in nonlinear adaptive filtering due to its simplicity and robustness. In kernel adaptive filters, the statistics of the input to the linear filter depend on the parameters of the kernel employed. Moreover, practical implementations require a finite nonlinearity model order. The Gaussian KLMS algorithm has two design parameters: the step size and the Gaussian kernel bandwidth. Thus, its design requires analytical models for the algorithm behavior as a function of these two parameters. This paper studies the steady-state and transient behavior of the Gaussian KLMS algorithm for Gaussian inputs and a finite-order nonlinearity model. In particular, we derive recursive expressions for the mean-weight-error vector and the mean-square error. The model predictions show excellent agreement with Monte Carlo simulations in both the transient phase and steady state. This allows the explicit analytical determination of stability limits, and makes it possible to choose the algorithm parameters a priori in order to achieve a prescribed convergence speed and quality of the estimate. Design examples are presented which validate the theoretical analysis and illustrate its application.
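For readers unfamiliar with the recursion being analyzed, the Gaussian KLMS update can be sketched as follows. This is a minimal growing-dictionary implementation for illustration only; the function names, parameter values, and test signal are assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth):
    # Gaussian kernel: exp(-||x - y||^2 / (2 * bandwidth^2))
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * bandwidth ** 2))

def klms(inputs, desired, step_size=0.5, bandwidth=0.5):
    """Gaussian KLMS with a growing dictionary.

    Each new input becomes a kernel center whose coefficient is the
    step size times the a-priori estimation error. Returns the
    per-sample a-priori errors e(n) = d(n) - y(n).
    """
    centers, coeffs, errors = [], [], []
    for x, d in zip(inputs, desired):
        # Filter output: weighted sum of kernels evaluated at the new input
        y = sum(a * gaussian_kernel(c, x, bandwidth)
                for a, c in zip(coeffs, centers))
        e = d - y
        # KLMS update: store the input as a center with coefficient mu * e
        centers.append(x)
        coeffs.append(step_size * e)
        errors.append(e)
    return np.array(errors)
```

Both design parameters studied in the paper appear explicitly: `step_size` (the LMS step size mu) and `bandwidth` (the Gaussian kernel width). On a static nonlinearity, the a-priori error typically decays as the dictionary grows, which is the transient behavior the analytical model predicts.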
