Averaged Boosting: A Noise-Robust Ensemble Method
[1] Yoav Freund, et al. A decision-theoretic generalization of on-line learning and an application to boosting. J. Comput. Syst. Sci., 1997.
[2] Peter L. Bartlett, et al. Functional Gradient Techniques for Combining Hypotheses. Advances in Large Margin Classifiers, 2000.
[3] J. Friedman, et al. Additive logistic regression: A statistical view of boosting (Special Invited Paper). The Annals of Statistics, 2000.
[4] Leo Breiman. Bagging Predictors. Machine Learning, 1996.
[5] Robert E. Schapire, et al. Boosting the margin: A new explanation for the effectiveness of voting methods. ICML, 1997.
[6] J. Ross Quinlan. Boosting First-Order Learning. ALT, 1996.
[7] Eric Bauer, et al. An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants. Machine Learning, 1999.
[8] Robert E. Schapire, et al. Improved Boosting Algorithms Using Confidence-rated Predictions. COLT, 1998.
[9] J. Friedman. Greedy function approximation: A gradient boosting machine. The Annals of Statistics, 2001.
[10] D. Opitz, et al. Popular Ensemble Methods: An Empirical Study. J. Artif. Intell. Res., 1999.
[11] Leo Breiman. Random Forests. Machine Learning, 2001.
[12] Gunnar Rätsch, et al. Soft Margins for AdaBoost. Machine Learning, 2001.
[13] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization. Machine Learning, 2000.