Hidden state estimation using the Correntropy Filter with fixed point update and adaptive kernel size

In this paper we review the Correntropy Filter for hidden state estimation and introduce a fixed-point update rule in place of gradient ascent, yielding faster convergence. We further propose an adaptive kernel bandwidth selection algorithm, which leaves the filter with no free parameters. We show that the resulting filter outperforms the Kalman Filter, and we demonstrate its capabilities on a simulated experiment and a vehicle tracking problem.
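For intuition, the sketch below illustrates one common way such a filter can be realized: a maximum-correntropy measurement update solved by fixed-point iteration (a reweighted least-squares step), with the kernel bandwidth set adaptively from the current residuals via Silverman's rule of thumb. The function name mcc_update, the whitening construction, and the bandwidth rule are illustrative assumptions for this sketch, not the exact update or bandwidth-selection scheme derived in the paper.

```python
import numpy as np

def gaussian_kernel(e, sigma):
    """Gaussian kernel evaluated elementwise at residuals e."""
    return np.exp(-e**2 / (2.0 * sigma**2))

def mcc_update(x_pred, P_pred, y, H, R, sigma=None, max_iter=50, tol=1e-6):
    """
    Hypothetical maximum-correntropy measurement update via fixed-point
    iteration (illustrative sketch, not the paper's exact rule).

    x_pred : predicted state mean (n,)
    P_pred : predicted state covariance (n, n)
    y      : measurement (m,)
    H      : measurement matrix (m, n)
    R      : measurement noise covariance (m, m)
    sigma  : kernel bandwidth; if None, it is re-estimated from the residuals
             with Silverman's rule of thumb (an assumption, not the paper's scheme).
    """
    n = x_pred.shape[0]
    # Whiten the prior and the measurement so their residuals share one scale.
    Bp = np.linalg.cholesky(P_pred)
    Br = np.linalg.cholesky(R)
    W = np.vstack([np.linalg.solve(Bp, np.eye(n)),
                   np.linalg.solve(Br, H)])
    d = np.concatenate([np.linalg.solve(Bp, x_pred),
                        np.linalg.solve(Br, y)])

    x = x_pred.copy()
    for _ in range(max_iter):
        e = d - W @ x                          # whitened residuals
        if sigma is None:
            # Adaptive bandwidth from the residual spread (assumed rule).
            s = max(np.std(e), 1e-12)
            kernel_sigma = 1.06 * s * e.size ** (-0.2)
        else:
            kernel_sigma = sigma
        c = gaussian_kernel(e, kernel_sigma)   # correntropy-induced weights
        C = np.diag(c)
        # Fixed-point step: weighted least-squares solve with current weights.
        x_new = np.linalg.solve(W.T @ C @ W, W.T @ C @ d)
        if np.linalg.norm(x_new - x) < tol * (np.linalg.norm(x) + tol):
            return x_new
        x = x_new
    return x
```

With sigma=None the bandwidth is re-estimated at every iteration, which is what removes the kernel size as a free parameter in this sketch; passing a fixed sigma recovers an ordinary fixed-bandwidth correntropy update.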
