Online efficient learning with quantized KLMS and L1 regularization

In a recent work, we proposed the quantized kernel least mean square (QKLMS) algorithm, which is quite effective for online, sequential learning of a nonlinear mapping with a slowly growing radial basis function (RBF) network. In this paper, to further reduce the network size, we propose a sparse QKLMS algorithm, derived by adding a sparsity-inducing l1-norm penalty on the coefficients to the squared-error cost. Simulation examples show that the new algorithm works efficiently and yields a much sparser network while preserving desirable performance.
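As a rough illustration of the idea described above, the Python sketch below combines a QKLMS-style update (online vector quantization of the input space into a growing RBF dictionary) with a soft-thresholding shrinkage step standing in for the l1 penalty, followed by pruning of coefficients driven exactly to zero. The function name `sparse_qklms`, the soft-threshold form of the l1 step, the pruning rule, and all parameter values are illustrative assumptions, not the paper's exact update.

```python
import numpy as np

def gaussian_kernel(x, centers, sigma=1.0):
    # Gaussian RBF kernel between one input x and an array of centers.
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def sparse_qklms(U, D, eta=0.5, eps_q=0.1, lam=1e-3, sigma=1.0):
    """Sketch of sparse QKLMS: quantized KLMS plus an l1-style
    soft-thresholding of the coefficients after each update.
    eta: step size, eps_q: quantization size, lam: l1 weight (assumed)."""
    centers, alpha = [U[0]], [eta * D[0]]     # initialize dictionary
    for u, d in zip(U[1:], D[1:]):
        if not centers:                       # dictionary emptied by pruning
            centers.append(u)
            alpha.append(eta * d)
            continue
        C = np.asarray(centers)
        a = np.asarray(alpha)
        y = a @ gaussian_kernel(u, C, sigma)  # current prediction
        e = d - y                             # a priori error
        # Quantization: merge into the nearest center if close enough,
        # otherwise grow the network by one RBF unit.
        dists = np.linalg.norm(C - u, axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= eps_q:
            alpha[j] += eta * e
        else:
            centers.append(u)
            alpha.append(eta * e)
        # l1 shrinkage (soft threshold); prune coefficients that reach zero.
        alpha = [np.sign(a_) * max(abs(a_) - eta * lam, 0.0) for a_ in alpha]
        keep = [k for k, a_ in enumerate(alpha) if a_ != 0.0]
        centers = [centers[k] for k in keep]
        alpha = [alpha[k] for k in keep]
    return np.asarray(centers), np.asarray(alpha)

# Toy usage: learn a 1-D nonlinear mapping online from noisy samples.
rng = np.random.default_rng(0)
U = rng.uniform(-1.0, 1.0, size=(500, 1))
D = np.sin(3.0 * U[:, 0]) + 0.05 * rng.standard_normal(500)
C, a = sparse_qklms(U, D)
print(len(C), "centers retained")
```

In this sketch the shrinkage step can produce exact zeros, so units whose coefficients vanish are removed, which is one plausible way the l1 penalty yields a sparser network than QKLMS alone.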
