A Preliminary Study of Diversity in Extreme Learning Machines Ensembles

In this paper, the neural-network formulation of the Extreme Learning Machine (ELM) is used as the base learner of an ensemble meta-algorithm that promotes diversity explicitly through the ELM loss function. The proposed cost function encourages orthogonality, measured by the scalar product, among the base learners' parameter vectors. Ensemble meta-algorithms from the AdaBoost family are used for comparison. Both the accuracy and the diversity achieved by our proposal are competitive, reinforcing the idea that introducing diversity explicitly is worthwhile.
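To make the idea concrete, a minimal sketch follows; the exact objective is not given above, so the penalty form, the shared hidden layer, and all symbols (C, lambda, beta_s) are our illustrative assumptions. Writing H for the hidden-layer output matrix, Y for the targets, and beta_s for the output weights of the s-th base learner, a diversity-promoting ridge-ELM objective can penalize the scalar product between beta_s and the weights of previously trained learners:

\min_{\beta_s} \; \|H\beta_s - Y\|^2 + C\|\beta_s\|^2 + \lambda \sum_{k<s} (\beta_k^\top \beta_s)^2

Because the penalty is quadratic in beta_s, the problem keeps the usual ELM closed-form solution, \beta_s = (H^\top H + C I + \lambda \sum_{k<s} \beta_k \beta_k^\top)^{-1} H^\top Y. The NumPy sketch below implements this sequential scheme under those assumptions.

```python
import numpy as np

def elm_hidden(X, W, b):
    """ELM hidden layer: fixed random projection followed by a sigmoid."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def diverse_elm_ensemble(X, y, n_learners=5, n_hidden=50, C=1.0, lam=10.0, seed=0):
    """Sequentially fit ELM output weights, penalizing the scalar product
    with the weight vectors already in the ensemble (illustrative penalty;
    a shared random hidden layer is assumed so that all weight vectors
    live in the same parameter space)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = elm_hidden(X, W, b)
    HtH, Hty = H.T @ H, H.T @ y
    betas = []
    for _ in range(n_learners):
        # Ridge term C*I plus lam * sum_k beta_k beta_k^T coming from the
        # quadratic orthogonality penalty; the solution stays closed form.
        A = HtH + C * np.eye(n_hidden)
        for beta_k in betas:
            A += lam * np.outer(beta_k, beta_k)
        betas.append(np.linalg.solve(A, Hty))
    return W, b, betas

def ensemble_predict(model, X):
    """Uniform average of the base learners' outputs."""
    W, b, betas = model
    H = elm_hidden(X, W, b)
    return np.mean([H @ beta for beta in betas], axis=0)

# Toy regression example on synthetic data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
print(ensemble_predict(diverse_elm_ensemble(X, y), X[:5]))
```

Sharing the random hidden layer across learners is a deliberate choice in this sketch: it places all output-weight vectors in the same space, which makes the scalar-product penalty meaningful, and increasing lam trades individual fit for ensemble diversity.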
