Outliers Elimination for Error Correction Algorithm Improvement

Neural networks remain an important part of artificial intelligence. RBF networks appear to be more powerful than networks based on sigmoid activation functions. Error Correction is a second-order training algorithm dedicated to RBF networks. This paper proposes a method for improving the algorithm by eliminating inconsistent (outlier) patterns from the training data. The approach is confirmed experimentally.
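The idea of eliminating inconsistent patterns can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes a Gaussian RBF network with fixed centers, output weights fitted by ordinary least squares (a stand-in for the second-order Error Correction training), and a simple residual-threshold rule for flagging outliers. The function names and the `k`-sigma threshold are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_design_matrix(X, centers, width):
    # Gaussian RBF activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2*width^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width):
    # Least-squares fit of the output weights (stand-in for the
    # second-order Error Correction training discussed in the paper).
    Phi = rbf_design_matrix(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def eliminate_outliers_and_refit(X, y, centers, width, k=2.5):
    # Fit once, drop patterns whose residual exceeds k standard
    # deviations, then refit on the consistent patterns only.
    w = fit_rbf(X, y, centers, width)
    residuals = rbf_design_matrix(X, centers, width) @ w - y
    keep = np.abs(residuals) <= k * residuals.std()
    w_clean = fit_rbf(X[keep], y[keep], centers, width)
    return w_clean, keep
```

For example, fitting a noisy sine curve in which one target value has been corrupted: the corrupted pattern produces a large residual, is flagged by the threshold, and the network is refitted without it.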
