A dictionary based survival error compensation for robust adaptive filtering

Survival information potential (SIP) is defined in terms of the survival distribution function of a random variable rather than its probability density function (PDF). SIP can serve as a risk function with a built-in learning-error compensation ability, and this SIP-based risk function does not involve PDF estimation, which makes it desirable for robust learning applications. The error compensation scheme provided by SIP requires the rank information of the learning errors. Accurate error compensation calls for a large amount of input data, which is computationally expensive. It is shown that the error compensation can be approximated by an error-related distribution. Based on this approximation, a dictionary-based error compensation scheme is proposed to obtain a fixed-budget recursive online learning method. The proposed method is compared with several well-known online learning methods, including the least-mean-square (LMS) method, the least absolute deviation (LAD) method, the affine projection algorithm, the recursive least-squares method, and the sliding-window-based SIP method. Simulation results validate the smooth and consistent convergence of the proposed method, particularly in α-stable noise environments.
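To illustrate the setting the abstract describes, the following is a minimal sketch of online adaptive filtering under heavy-tailed (α-stable) noise. It is not the paper's SIP-based method: it contrasts two of the baseline algorithms named above, plain LMS and sign-LMS (a stochastic-gradient form of the LAD criterion), on a system-identification task. Cauchy noise (the α = 1 special case of the α-stable family) stands in for the impulsive noise; the filter length, step sizes, and noise scale are all assumed for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's SIP method): identify an unknown
# 4-tap system from input/output data corrupted by impulsive noise.
# Cauchy noise is used as a concrete alpha-stable example (alpha = 1).
rng = np.random.default_rng(42)

n_taps, n_samples = 4, 5000
w_true = rng.standard_normal(n_taps)          # unknown system weights
x = rng.standard_normal((n_samples, n_taps))  # input regressors
noise = 0.1 * rng.standard_cauchy(n_samples)  # heavy-tailed measurement noise
d = x @ w_true + noise                        # desired (noisy) output

def adapt(grad_of_error, mu):
    """Generic online stochastic-gradient filter.

    grad_of_error maps the instantaneous error e to the factor
    multiplying the regressor in the weight update.
    """
    w = np.zeros(n_taps)
    for xi, di in zip(x, d):
        e = di - xi @ w
        w = w + mu * grad_of_error(e) * xi
    return w

w_lms = adapt(lambda e: e, mu=0.01)  # LMS: gradient of the squared error
w_lad = adapt(np.sign, mu=0.01)      # sign-LMS: gradient of the absolute error

print("LMS misalignment:", np.linalg.norm(w_lms - w_true))
print("LAD misalignment:", np.linalg.norm(w_lad - w_true))
```

Because the sign nonlinearity bounds each weight update regardless of the error magnitude, the LAD filter is insensitive to the Cauchy outliers, whereas a single large noise sample can throw the LMS weights far off; this robustness gap is the motivation for error-compensating criteria such as SIP.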
