Additive logistic regression: A statistical view of boosting (Special Invited Paper)

Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data and then taking a weighted majority vote of the sequence of classifiers thus produced. For many classification algorithms, this simple strategy results in dramatic improvements in performance. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they give results nearly identical to boosting. Direct multiclass generalizations based on multinomial likelihood are derived that exhibit performance comparable to other recently proposed multiclass generalizations of boosting in most situations, and far superior in some. We suggest a minor modification to boosting that can reduce computation, often by factors of 10 to 50. Finally, we apply these insights to produce an alternative formulation of boosting decision trees. This approach, based on best-first truncated tree induction, often leads to better performance, and can provide interpretable descriptions of the aggregate decision rule. It is also much faster computationally, making it more suitable for large-scale data mining applications.
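The statistical view claimed in the abstract can be stated compactly. The display below restates, in the paper's own two-class notation (y coded as -1/+1), how the boosting fit is an additive model driven stagewise by an exponential criterion whose population minimizer is half the log-odds, which is what ties AdaBoost to the logistic scale; it is a summary of the paper's argument, not an addition to it.

```latex
% Additive model built up by boosting, y coded as -1/+1
F(x) = \sum_{m=1}^{M} c_m f_m(x)

% AdaBoost minimizes this exponential criterion in a stagewise fashion
J(F) = E\!\left[e^{-yF(x)}\right]

% Population minimizer: half the log-odds, hence a logistic model
F^{*}(x) = \tfrac{1}{2}\log\frac{P(y=1\mid x)}{P(y=-1\mid x)},
\qquad
P(y=1\mid x) = \frac{e^{F(x)}}{e^{F(x)}+e^{-F(x)}}
```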

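As a concrete illustration of the reweight-and-vote recipe described in the abstract, the following sketch implements Discrete AdaBoost with decision stumps as the base classifier. It is a minimal sketch, not the authors' code: the function names are hypothetical, and the use of scikit-learn's DecisionTreeClassifier with sample weights is an assumption made for convenience.

```python
# Minimal sketch of boosting as described above: refit a weak classifier to
# reweighted data, then combine the fits by a weighted majority vote.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Discrete AdaBoost with stumps; y must be coded as -1/+1."""
    n = len(y)
    w = np.full(n, 1.0 / n)                        # observation weights, start uniform
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)           # fit to the reweighted data
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)  # weighted error rate
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)      # this classifier's vote
        w *= np.exp(-alpha * y * pred)             # upweight misclassified cases
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Weighted majority vote = sign of the additive fit F(x)
    F = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(F)
```

The weight update exp(-alpha * y * pred) makes each successive stump concentrate on the cases the current committee gets wrong, and the final sign of the weighted sum is exactly the weighted majority vote referred to in the abstract.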