AdaBoost boosts the performance of a weak learner by training a committee of weak learners that learn different features of the training sample space with different emphasis and jointly perform classification or regression of each new data sample by a weighted cumulative vote. We use RBF kernel classifiers to demonstrate that boosting a strong learner generally degrades performance, and we identify three patterns of performance degradation corresponding to three strength levels of the underlying learner. We demonstrate that boosting productivity increases, peaks, and then falls as the strength of the underlying learner increases. We highlight characteristic patterns in AdaBoost's sample-weight distribution and argue that its tendency to force a strong learner to concentrate too heavily on the very hard samples or outliers is the cause of performance degradation when boosting strong learners. However, by boosting an underlying classifier of appropriately low strength, we are able in many instances to raise the committee's performance to, or beyond, the levels achievable by strengthening the individual classifier through parameter or model selection. We conclude that, when the strength of the underlying learner approaches the identified strength levels, performance degradation can be avoided and high boosting productivity achieved by weakening the learner prior to boosting.
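A minimal sketch of the idea described above, assuming scikit-learn and a synthetic dataset (not the paper's experimental setup): an RBF-kernel classifier is used as the base learner for AdaBoost, and its strength is varied through the kernel width gamma, with smaller gamma giving a smoother, weaker learner. The dataset, parameter values, and the choice of gamma as the strength knob are illustrative assumptions.

```python
# Illustrative sketch only: boosting an RBF-kernel classifier whose strength
# is controlled by the kernel width gamma (assumption: smaller gamma = weaker,
# smoother base learner). Not the paper's exact data or settings.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification data with some label noise (outliers).
X, y = make_classification(n_samples=600, n_features=10, flip_y=0.05,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for gamma in (0.01, 0.1, 1.0):  # weak -> strong base learner
    # probability=True lets AdaBoost use class probabilities if its default
    # algorithm requires them; older scikit-learn versions name the first
    # argument base_estimator instead of estimator.
    base = SVC(kernel="rbf", gamma=gamma, probability=True)
    committee = AdaBoostClassifier(estimator=base, n_estimators=50,
                                   random_state=0)
    committee.fit(X_train, y_train)
    print(f"gamma={gamma}: committee test accuracy = "
          f"{committee.score(X_test, y_test):.3f}")
```

Comparing the committee's accuracy across gamma values gives a rough picture of the effect discussed in the abstract: boosting helps most for a suitably weak base learner and can hurt when the base learner is already strong.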