Using Feature Distribution Methods in Ensemble Systems Combined by Fusion and Selection-Based Methods

The main prerequisite for the efficient use of ensemble systems is that the base classifiers be diverse. One way of increasing diversity is to distribute the input features among the classifiers of the ensemble. This paper investigates the use of feature distribution methods among the classifiers of ensemble systems, employing five different data distribution methods. The resulting ensembles are combined by six existing combination methods, four of which are fusion-based and two of which are selection-based. The aim is to determine which ensemble systems benefit most from distributing features among their classifiers.
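As a rough illustration of the idea, the sketch below builds an ensemble in which each base classifier is trained on its own random subset of the features (a random-subspace style distribution) and the members are combined by majority voting, a simple fusion-based method. This is an assumed, generic setup using scikit-learn decision trees on the Iris dataset; it is not the specific data distribution or combination methods evaluated in the paper.

```python
# Minimal sketch: feature distribution among base classifiers, fused by majority vote.
# Assumed setup (random-subspace style), not the paper's exact methods.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

n_classifiers = 5
subset_size = 2  # each base classifier sees only a subset of the features

members = []
for _ in range(n_classifiers):
    # Distribute features: each member trains on its own random feature subset.
    features = rng.choice(X.shape[1], size=subset_size, replace=False)
    clf = DecisionTreeClassifier(random_state=0).fit(X_train[:, features], y_train)
    members.append((features, clf))

# Fusion-based combination: majority vote over the members' predictions.
votes = np.array([clf.predict(X_test[:, f]) for f, clf in members])
ensemble_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print("ensemble accuracy:", np.mean(ensemble_pred == y_test))
```

A selection-based combiner would instead pick, for each test sample, the member judged most competent in that region of the feature space rather than pooling all votes.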
