Dynamic ensemble extreme learning machine based on sample entropy

The extreme learning machine (ELM) is a learning algorithm for single-hidden-layer feed-forward neural networks. By randomly assigning the input weights and hidden-layer biases and solving for the output weights analytically, ELM avoids several drawbacks of traditional gradient-based learning algorithms, such as convergence to local minima, sensitivity to the learning rate, and slow training. However, ELM suffers from instability and over-fitting, especially on large datasets. In this paper, a dynamic ensemble extreme learning machine based on sample entropy is proposed, which alleviates these problems to some extent and improves prediction accuracy. Experimental results show that the proposed approach is robust and efficient.
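The core ELM training step described above can be sketched as follows. This is a minimal illustrative implementation, not the authors' method: the class name, the choice of sigmoid activation, and the uniform weight initialization are assumptions for the example; the only parts fixed by the abstract are that input weights and hidden biases are random and untrained, and that output weights are obtained analytically (here via the Moore-Penrose pseudoinverse).

```python
import numpy as np

class ELM:
    """Minimal single-hidden-layer ELM classifier (illustrative sketch)."""

    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activation of the randomly weighted inputs.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1
        # Input weights and hidden biases are drawn at random and never trained.
        self.W = self.rng.uniform(-1.0, 1.0, size=(n_features, self.n_hidden))
        self.b = self.rng.uniform(-1.0, 1.0, size=self.n_hidden)
        H = self._hidden(X)
        T = np.eye(n_classes)[y]  # one-hot target matrix
        # Output weights solved in closed form: beta = H^+ T (no iterative descent).
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

Because training reduces to one pseudoinverse computation, a single pass over the data suffices; the instability the paper addresses arises from the random draw of `W` and `b`, which an ensemble of such models can average out.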
