Hybrid affine projection algorithm

In this work, we propose a new adaptation criterion, the hybrid criterion (HC), which combines the traditional mean square error (MSE) with the maximum correntropy criterion (MCC). The HC is developed from the viewpoint of the least trimmed squares (LTS) estimator, a high-breakdown estimator that resists undue influence from outliers. The LTS estimator ranks the data and divides them into two categories, normal data and outliers, and simply discards the outliers. To achieve this robustness, however, LTS also throws away data with large residuals that may still carry useful information. Instead of discarding those data, the new criterion applies the robust MCC to them, and can therefore exploit them to further improve performance. We apply the HC to adaptive filtering and develop the hybrid affine projection algorithm (HAPA) and the kernel hybrid affine projection algorithm (KHAPA). Simulation results show that the proposed algorithms perform very well.
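The idea of the hybrid criterion can be sketched as a cost function: rank the squared residuals as in LTS, apply the MSE loss to the smaller ones, and apply the bounded correntropy-induced (MCC) loss to the larger ones instead of discarding them. The sketch below is illustrative only; the trimming fraction `trim_fraction` and kernel width `sigma` are assumed tuning parameters, not values from the paper.

```python
import numpy as np

def hybrid_cost(e, trim_fraction=0.25, sigma=1.0):
    """Hybrid criterion (HC) sketch.

    Rank squared residuals as in LTS; the smallest (1 - trim_fraction)
    fraction ("normal" data) gets the MSE loss, while the largest
    residuals get the bounded MCC loss instead of being thrown away.
    trim_fraction and sigma are illustrative, not from the paper.
    """
    e2 = np.sort(np.asarray(e, dtype=float) ** 2)
    h = int(np.ceil((1.0 - trim_fraction) * e2.size))  # count of "normal" samples
    mse_part = 0.5 * np.sum(e2[:h])                    # MSE on the normal data
    # Correntropy-induced loss on large residuals:
    #   sigma^2 * (1 - exp(-e^2 / (2 sigma^2))), bounded by sigma^2 per sample
    mcc_part = np.sum(sigma**2 * (1.0 - np.exp(-e2[h:] / (2.0 * sigma**2))))
    return mse_part + mcc_part
```

Because the MCC term saturates at `sigma**2` per sample, a single gross outlier contributes only a bounded amount to the cost, while the untrimmed data keep the familiar quadratic (MSE) behavior.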
