A Novel Adaptive-Boost-Based Strategy for Combining Classifiers Using Diversity Concept

In classifier combination, the diversity among the classifiers' outputs is one of the most important considerations. There are many methods for combining classifiers. AdaBoost is an incremental method for creating a classifier ensemble in which each base classifier has a local focus: the classifiers are data-biased and specialize in classifying particular subsets of the data. In this paper we propose a new method for combining classifiers that builds on AdaBoost and the diversity concept. We have evaluated this method on several data sets and compared its results with those of other methods. The results indicate that further variants of this method can be developed to achieve better performance.
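As a concrete illustration of the two ingredients the abstract mentions, the sketch below trains a standard scikit-learn AdaBoost ensemble and computes a simple pairwise disagreement score over the base classifiers' outputs as a diversity measure. The Iris data set, the decision-stump base learner, and the disagreement measure are illustrative assumptions; this is not the combination strategy proposed in the paper.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Boosted ensemble of decision stumps (the scikit-learn default base learner).
# Each boosting round re-weights the training data, so every base classifier
# is biased toward the examples its predecessors misclassified.
ensemble = AdaBoostClassifier(n_estimators=25, random_state=0)
ensemble.fit(X_train, y_train)

# Outputs of every base classifier on the test set.
P = np.array([clf.predict(X_test) for clf in ensemble.estimators_])

def mean_pairwise_disagreement(preds):
    """Average fraction of samples on which a pair of base classifiers differ."""
    n = len(preds)
    scores = [np.mean(preds[i] != preds[j])
              for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(scores))

print("ensemble accuracy:", ensemble.score(X_test, y_test))
print("mean pairwise disagreement (diversity):", mean_pairwise_disagreement(P))
```

A combination strategy in the spirit of the abstract would use such a diversity score, alongside accuracy, to decide how the data-biased base classifiers are weighted or selected when their outputs are fused.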
