Untrained weighted classifier combination with embedded ensemble pruning

One of the crucial problems in designing a classifier ensemble is the combination rule, which is responsible for deriving a single decision from the pool of predictors. The final decision is made on the basis of the outputs of the individual classifiers. At the same time, some of the individual classifiers contribute little to the collective decision and may be discarded. This paper discusses how to design an effective combination rule based on the support functions returned by the individual classifiers. We focus on aggregation methods that do not require training, because many real-life problems offer no abundance of training objects or impose time constraints. Additionally, we show how to use the proposed operators for simultaneous classifier combination and ensemble pruning. The proposed schemes embed a classifier selection step based on weight thresholding. The experimental analysis, carried out on a set of benchmark datasets and backed up by statistical tests, proves the usefulness of the proposed method, especially when the number of class labels is high.
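To make the scheme concrete, the following is a minimal Python sketch of weighted support aggregation with embedded pruning by weight thresholding. It assumes the per-classifier weights are already available; the paper's specific untrained weighting operators are not reproduced here, and the names `supports`, `weights`, and `threshold` are illustrative.

```python
import numpy as np

def weighted_combination_with_pruning(supports, weights, threshold=0.1):
    """Combine class supports from an ensemble, pruning low-weight members.

    supports : array of shape (n_classifiers, n_samples, n_classes)
        Support (e.g. posterior probability) output of each base classifier.
    weights  : array of shape (n_classifiers,)
        Per-classifier weights obtained without training (their exact
        derivation is paper-specific and assumed here).
    threshold: classifiers whose normalized weight falls below this value
        are discarded before aggregation (embedded ensemble pruning).
    """
    supports = np.asarray(supports)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()           # normalize to sum to one

    keep = weights >= threshold                 # pruning by weight thresholding
    if not keep.any():                          # guard: retain the strongest member
        keep = weights == weights.max()

    w = weights[keep] / weights[keep].sum()     # renormalize surviving weights
    # Weighted average of the supports over the pruned ensemble.
    combined = np.tensordot(w, supports[keep], axes=(0, 0))
    return combined.argmax(axis=-1)             # final class label per sample
```

A short usage example under the same assumptions, with random supports standing in for real classifier outputs:

```python
rng = np.random.default_rng(0)
supports = rng.dirichlet(np.ones(3), size=(5, 10))   # 5 classifiers, 10 samples, 3 classes
weights = np.array([0.30, 0.25, 0.05, 0.25, 0.15])   # third classifier falls below threshold
labels = weighted_combination_with_pruning(supports, weights, threshold=0.1)
```

Note that pruning and combination happen in a single pass: thresholding the weights selects the sub-ensemble, and the same (renormalized) weights then drive the aggregation, so no separate selection stage or training procedure is required.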
