A stochastic behavior analysis of the stochastic restricted-gradient descent algorithm in reproducing kernel Hilbert spaces

This paper presents a stochastic behavior analysis of a kernel-based stochastic restricted-gradient descent method. The restricted gradient gives the steepest ascent direction within the so-called dictionary subspace. The analysis characterizes the transient and steady-state performance under the mean-squared-error criterion, and also provides stability conditions in the mean and mean-square sense. The present study builds on the analysis of the kernel normalized least mean square (KNLMS) algorithm initially proposed by Chen et al. Simulation results validate the analysis.
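To make the setting concrete, a KNLMS-type update restricted to a fixed dictionary can be sketched as follows. This is only a rough illustration of the class of algorithm being analyzed, not the authors' exact method: the Gaussian kernel width, step size, regularizer, and the pre-tuned dictionary below are all illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel between two input vectors
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def knlms_step(coeffs, dictionary, x, d, step=0.5, eps=1e-4, sigma=1.0):
    """One KNLMS-style update, restricted to the dictionary subspace.

    coeffs     : current expansion coefficients (one per dictionary atom)
    dictionary : stored input vectors spanning the dictionary subspace
    x, d       : new input vector and desired output
    """
    # Kernelized input: kernel evaluated between x and each dictionary atom
    k = np.array([gaussian_kernel(x, c, sigma) for c in dictionary])
    # A-priori estimation error of the current filter
    e = d - coeffs @ k
    # Normalized gradient step within the dictionary subspace
    # (eps regularizes the normalization term)
    coeffs = coeffs + step * e / (eps + k @ k) * k
    return coeffs, e

# Toy usage: identify d = sin(x) online with a fixed, pre-tuned dictionary
rng = np.random.default_rng(0)
dictionary = np.linspace(-3, 3, 15).reshape(-1, 1)
coeffs = np.zeros(len(dictionary))
for _ in range(2000):
    x = rng.uniform(-3, 3, size=1)
    d = np.sin(x[0])
    coeffs, e = knlms_step(coeffs, dictionary, x, d, sigma=0.7)
```

Because the update direction lies in the span of the kernel functions centered at the dictionary atoms, the filter never leaves the dictionary subspace, which is the restriction the analyzed restricted gradient formalizes in the RKHS.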

[1] Alexander J. Smola et al., Online learning with kernels, IEEE Transactions on Signal Processing, 2001.

[2] Masahiro Yukawa et al., An efficient kernel adaptive filtering algorithm using hyperplane projection along affine subspace, Proceedings of the 20th European Signal Processing Conference (EUSIPCO), 2012.

[3] Cédric Richard et al., Stochastic behavior analysis of the Gaussian kernel least-mean-square algorithm, IEEE Transactions on Signal Processing, 2012.

[4] John C. Platt, A resource-allocating network for function interpolation, Neural Computation, 1991.

[5] Weifeng Liu et al., Kernel Adaptive Filtering, 2010.

[6] Paul Honeine et al., Online prediction of time series data with kernels, IEEE Transactions on Signal Processing, 2009.

[7] Miguel Lázaro-Gredilla et al., Kernel recursive least-squares tracker for time-varying regression, IEEE Transactions on Neural Networks and Learning Systems, 2012.

[8] S. Haykin, Adaptive Filters, 2007.

[9] Masahiro Yukawa et al., An efficient sparse kernel adaptive filtering algorithm based on isomorphism between functional subspace and Euclidean space, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014.

[10] Cédric Richard et al., Closed-form conditions for convergence of the Gaussian kernel-least-mean-square algorithm, Conference Record of the Forty-Sixth Asilomar Conference on Signals, Systems and Computers (ASILOMAR), 2012.

[11] H. Al-Duwaish et al., Use of multilayer feedforward neural networks in identification and control of Wiener model, 1996.

[12] Badong Chen et al., Quantized kernel least mean square algorithm, IEEE Transactions on Neural Networks and Learning Systems, 2012.

[13] S. Haykin et al., Kernel least-mean-square algorithm, 2010.

[14] Badong Chen et al., Mean square convergence analysis for kernel least mean square algorithm, Signal Processing, 2012.

[15] Jie Chen et al., Convergence analysis of kernel LMS algorithm with pre-tuned dictionary, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014.

[16] Masahiro Yukawa et al., Multikernel adaptive filtering, IEEE Transactions on Signal Processing, 2012.

[17] Sergios Theodoridis et al., Adaptive constrained learning in reproducing kernel Hilbert spaces: the robust beamforming case, IEEE Transactions on Signal Processing, 2009.

[18] Jie Chen et al., Online dictionary learning for kernel LMS, IEEE Transactions on Signal Processing, 2014.

[19] Masahiro Yukawa et al., Adaptive nonlinear estimation based on parallel projection along affine subspaces in reproducing kernel Hilbert space, IEEE Transactions on Signal Processing, 2015.

[20] Shie Mannor et al., The kernel recursive least-squares algorithm, IEEE Transactions on Signal Processing, 2004.