A Homogeneous-Heterogeneous Ensemble of Classifiers

In this study, we introduce an ensemble system that combines a homogeneous ensemble and a heterogeneous ensemble in a single framework. Based on the observation that data projected by random projections differ significantly from the original data and from each other, we construct the homogeneous module by applying random projections to the training data to obtain new training sets. In the heterogeneous module, several learning algorithms train on these new training sets to generate the base classifiers. We propose four combining algorithms based on the Sum Rule and the Majority Vote Rule for the proposed ensemble. Experiments on several popular datasets confirm that the proposed ensemble method outperforms a number of well-known benchmark algorithms. The proposed framework also offers great flexibility in real-world applications: any technique that enriches the training data can serve the homogeneous module, and any set of learning algorithms can serve the heterogeneous module.
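
To make the two-module structure concrete, the following is a minimal sketch of the idea described above, assuming scikit-learn. The projection sizes, the particular learners, and the two combiners shown (a simple Sum Rule over class probabilities and a plain Majority Vote) are illustrative assumptions, not the authors' exact four combining algorithms.

```python
import numpy as np
from sklearn.base import clone
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.random_projection import GaussianRandomProjection
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Homogeneous module: several random projections produce diverse training sets.
projections = [GaussianRandomProjection(n_components=3, random_state=s) for s in range(5)]

# Heterogeneous module: different learning algorithms train on each projected set.
learners = [DecisionTreeClassifier(random_state=0), GaussianNB(), KNeighborsClassifier()]

ensemble = []  # list of (fitted projection, fitted base classifier) pairs
for proj in projections:
    X_proj = proj.fit_transform(X_train)
    for learner in learners:
        clf = clone(learner).fit(X_proj, y_train)
        ensemble.append((proj, clf))

# Sum Rule: add the class-probability outputs of all base classifiers.
prob_sum = sum(clf.predict_proba(proj.transform(X_test)) for proj, clf in ensemble)
sum_rule_pred = prob_sum.argmax(axis=1)

# Majority Vote Rule: each base classifier casts one vote per test sample.
votes = np.stack([clf.predict(proj.transform(X_test)) for proj, clf in ensemble])
majority_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

print("Sum Rule accuracy:     ", (sum_rule_pred == y_test).mean())
print("Majority Vote accuracy:", (majority_pred == y_test).mean())
```

In this sketch, swapping the list of projections for any other data-enrichment technique, or replacing the list of learners with a different set of algorithms, changes nothing else in the pipeline, which is the flexibility the framework aims for.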
