Two-Stage Weighted Regularized Extreme Learning Machine for Class Imbalanced Learning

Compared to conventional machine learning techniques, the extreme learning machine (ELM), which trains single-hidden-layer feedforward neural networks (SLFNs), offers faster learning and better generalization performance. However, like most representative supervised learning algorithms, ELM tends to produce biased decision models when datasets are imbalanced. In this paper, a two-stage weighted regularized ELM is proposed to address this issue. The original regularized ELM (RELM) was designed to handle the adverse effects of outliers but did not target the imbalanced learning problem. In the first stage, we therefore propose a new weighted regularized ELM (WRELM) for class imbalance learning (CIL). Unlike the existing weighted ELM, which considers only the class distribution of the dataset, the proposed algorithm also emphasizes hard, misclassified samples in the second stage: the focal loss function is adopted to update the sample weights, decreasing the weights of well-classified samples so that training concentrates on the errors of difficult samples. The final class label is determined by the winner-take-all method. We evaluate the proposed method on 25 binary and 10 multiclass datasets using 5-fold cross-validation. The results indicate that the proposed algorithm is an effective method for CIL and outperforms other ELM-based CIL algorithms.
