Ensemble methods such as Bagging and Boosting, which combine the decisions of multiple hypotheses, are among the strongest existing machine learning methods. The diversity of the members of an ensemble is known to be an important factor in determining its generalization error. DECORATE (Diverse Ensemble Creation by Oppositional Relabeling of Artificial Training Examples) directly constructs diverse hypotheses using additional artificially constructed training examples. The technique is a simple, general meta-learner that can use any strong learner as a base classifier to build diverse committees. The diverse ensembles produced by DECORATE are very effective at reducing the amount of supervision required to build accurate models. DECORATE ensembles can also reduce supervision through active learning, in which the learner selects the most informative examples from a pool of unlabeled examples, so that acquiring their labels will increase the accuracy of the classifier.
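The core DECORATE loop can be sketched as follows. This is a minimal, hypothetical illustration, not the reference implementation: it assumes a toy nearest-centroid base learner and per-feature Gaussian generation of artificial examples, and it simplifies the oppositional relabeling step (the actual algorithm samples artificial labels inversely proportional to the current ensemble's predicted class probabilities).

```python
import random
from collections import Counter

class CentroidClassifier:
    """Toy base learner: predicts the class whose feature mean is nearest."""
    def fit(self, X, y):
        self.centroids = {}
        for label in set(y):
            pts = [x for x, l in zip(X, y) if l == label]
            self.centroids[label] = [sum(col) / len(pts) for col in zip(*pts)]
        return self

    def predict(self, X):
        def dist(a, b):
            return sum((u - v) ** 2 for u, v in zip(a, b))
        return [min(self.centroids, key=lambda l: dist(x, self.centroids[l]))
                for x in X]

def ensemble_predict(ensemble, X):
    """Majority vote over committee members."""
    votes = [clf.predict(X) for clf in ensemble]
    return [Counter(col).most_common(1)[0][0] for col in zip(*votes)]

def ensemble_error(ensemble, X, y):
    preds = ensemble_predict(ensemble, X)
    return sum(p != t for p, t in zip(preds, y)) / len(y)

def decorate(X, y, max_size=5, max_iters=20, r_art=0.5, seed=0):
    """Sketch of DECORATE: grow a committee with oppositionally labeled
    artificial examples, rejecting members that raise training error."""
    rng = random.Random(seed)
    ensemble = [CentroidClassifier().fit(X, y)]
    err = ensemble_error(ensemble, X, y)
    labels = sorted(set(y))
    n_art = max(1, int(r_art * len(X)))
    # Per-feature mean/std, used to draw Gaussian artificial examples.
    means = [sum(col) / len(X) for col in zip(*X)]
    stds = [(sum((v - m) ** 2 for v in col) / len(X)) ** 0.5 or 1.0
            for col, m in zip(zip(*X), means)]
    for _ in range(max_iters):
        if len(ensemble) >= max_size:
            break
        # Generate artificial examples from the feature distribution.
        X_art = [[rng.gauss(m, s) for m, s in zip(means, stds)]
                 for _ in range(n_art)]
        # Label them in *opposition* to the current ensemble's predictions
        # (simplified: pick any disagreeing label uniformly at random).
        current = ensemble_predict(ensemble, X_art)
        y_art = [rng.choice([l for l in labels if l != p]) for p in current]
        # Train a candidate member on real + artificial data.
        ensemble.append(CentroidClassifier().fit(X + X_art, y + y_art))
        new_err = ensemble_error(ensemble, X, y)
        if new_err > err:
            ensemble.pop()   # reject members that hurt training accuracy
        else:
            err = new_err
    return ensemble
```

Because every candidate member is checked against the committee's training error before being kept, the resulting ensemble is diverse (members were fit to deliberately conflicting artificial labels) without sacrificing accuracy on the real data.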