Improving Classification Performance through an Advanced Ensemble Based Heterogeneous Extreme Learning Machines

Extreme Learning Machine (ELM) is a fast learning algorithm for single-hidden-layer feedforward neural networks (SLFNs). It often generalizes well, but it can overfit the training data when it uses more hidden nodes than necessary. To improve generalization, we adopt a heterogeneous ensemble approach and propose an Advanced ELM Ensemble (AELME) for classification that combines Regularized ELM, L2-norm-optimized ELM (ELML2), and Kernel ELM. The ensemble is constructed by training a randomly chosen ELM classifier on a subset of the training data drawn by random resampling. AELME is then evolved with an objective function that increases both the diversity and the accuracy of the final ensemble, and the class label of unseen data is predicted by majority vote. Splitting the training data into subsets and combining heterogeneous ELM classifiers yield higher prediction accuracy, better generalization, and fewer base classifiers than competing models (AdaBoost, Bagging, the dynamic ELM ensemble, the data-splitting ELM ensemble, and the plain ELM ensemble). The validity of AELME is confirmed through classification experiments on several real-world benchmark datasets.
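The construction sketched in the abstract (bootstrap resampling, a randomly chosen heterogeneous ELM base learner per member, majority vote) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: only two of the three base learners are shown (ELML2 is omitted), the diversity/accuracy-driven evolution step is skipped, and all class and function names here are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RegularizedELM:
    """Single-hidden-layer ELM with a ridge-regularized output layer (sketch)."""
    def __init__(self, n_hidden=50, C=1.0, rng=None):
        self.n_hidden, self.C = n_hidden, C
        self.rng = rng if rng is not None else np.random.default_rng()

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        T = np.where(y[:, None] == self.classes_[None, :], 1.0, -1.0)  # one-vs-all targets
        # Hidden-layer weights are random and never trained -- the defining trait of ELM.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = sigmoid(X @ self.W + self.b)
        # Ridge solution for the output weights: beta = (H'H + I/C)^-1 H'T
        self.beta = np.linalg.solve(H.T @ H + np.eye(self.n_hidden) / self.C, H.T @ T)
        return self

    def predict(self, X):
        H = sigmoid(X @ self.W + self.b)
        return self.classes_[np.argmax(H @ self.beta, axis=1)]

class KernelELM:
    """Kernel ELM with an RBF kernel (sketch): solves (K + I/C) alpha = T."""
    def __init__(self, C=1.0, gamma=1.0):
        self.C, self.gamma = C, gamma

    def _kernel(self, A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-self.gamma * sq)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        T = np.where(y[:, None] == self.classes_[None, :], 1.0, -1.0)
        self.X_train = X
        self.alpha = np.linalg.solve(self._kernel(X, X) + np.eye(len(X)) / self.C, T)
        return self

    def predict(self, X):
        scores = self._kernel(X, self.X_train) @ self.alpha
        return self.classes_[np.argmax(scores, axis=1)]

def aelme_fit(X, y, n_members=7, rng=None):
    """Each member: a randomly chosen ELM variant trained on a bootstrap resample."""
    rng = rng if rng is not None else np.random.default_rng(0)
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))  # random resampling with replacement
        model = RegularizedELM(rng=rng) if rng.integers(2) == 0 else KernelELM()
        members.append(model.fit(X[idx], y[idx]))
    return members

def aelme_predict(members, X):
    """Majority vote over member predictions (assumes integer class labels)."""
    votes = np.stack([m.predict(X) for m in members])
    return np.array([np.bincount(col).argmax() for col in votes.T.astype(int)])
```

In this simplified form the ensemble reduces to heterogeneous bagging; the paper's distinguishing step, selecting members with an objective function that trades off diversity against accuracy, would replace the plain loop in `aelme_fit`.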
