Improving Accuracy of the Kalman Filter Algorithm in Dynamic Conditions Using ANN-Based Learning Module

Prediction algorithms enable computers to learn from historical data in order to make accurate decisions about an uncertain future, maximizing expected benefit or avoiding potential loss. Conventional prediction algorithms are usually based on a model trained from historical data. The problem with such algorithms is their inability to adapt to dynamic scenarios and changing conditions. This paper presents a novel learning-to-prediction model that improves the performance of prediction algorithms under dynamic conditions. In the proposed model, a learning module is attached to the prediction algorithm and acts as a supervisor, continuously monitoring and improving the algorithm's performance by analyzing its output and considering external factors that may influence it. To evaluate the effectiveness of the proposed learning-to-prediction model, we developed an artificial neural network (ANN)-based learning module to improve the prediction accuracy of the Kalman filter algorithm as a case study. For experimental analysis, we consider a scenario in which the Kalman filter is used to predict the actual temperature from noisy sensor readings. The conventional Kalman filter uses a fixed measurement error covariance R, which is not suitable for dynamic situations where the error in sensor readings varies due to external factors. In this study, we assume that the error in the temperature sensor readings varies with the changing humidity level. We developed an ANN-based learning module that estimates the amount of error in the current readings and updates R in the Kalman filter accordingly. Through experiments, we observed that the Kalman filter with the learning module outperformed the conventional Kalman filter by 4.41%–11.19% in terms of the root mean squared error metric.
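
To make the idea concrete, the sketch below shows a minimal 1-D Kalman filter for temperature readings in which the measurement error covariance R is refreshed at every step from the current humidity. The function and variable names (kalman_update, estimate_measurement_noise, the humidity-to-variance fallback, and the sample readings) are illustrative assumptions, not the paper's actual trained ANN or data; a trained regressor would replace the placeholder heuristic.

```python
import numpy as np

def kalman_update(x_est, p_est, z, q, r):
    """One predict/update step of a 1-D Kalman filter.

    x_est, p_est : previous state estimate and its error covariance
    z            : current (noisy) sensor reading
    q, r         : process noise covariance and measurement noise covariance
    """
    # Predict (constant-temperature model, so the state simply carries over)
    x_pred = x_est
    p_pred = p_est + q

    # Update with the new measurement
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # corrected estimate
    p_new = (1.0 - k) * p_pred         # corrected covariance
    return x_new, p_new


def estimate_measurement_noise(humidity, ann_model=None):
    """Stand-in for the ANN-based learning module: map the current humidity
    level to an estimated measurement error variance R. `ann_model` would be
    a trained regressor; the fallback below is only a hypothetical heuristic
    for illustration."""
    if ann_model is not None:
        return float(ann_model.predict([[humidity]])[0])
    return 0.5 + 0.05 * humidity  # assumed humidity-to-variance mapping


# Example: filter a stream of noisy temperature readings while adapting R
# to the changing humidity level at each step (learning-to-prediction loop).
readings = [(21.3, 40.0), (21.9, 55.0), (22.4, 70.0)]  # (reading in °C, humidity %)
x, p = readings[0][0], 1.0
for z, humidity in readings:
    r = estimate_measurement_noise(humidity)       # adaptive R from the learning module
    x, p = kalman_update(x, p, z, q=0.01, r=r)
print("filtered temperature:", round(x, 2))
```

A conventional Kalman filter would keep r fixed throughout the loop; the only change introduced by the learning module is the per-step call that re-estimates r before each update.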
