Improve the Performance of Random Forests by Introducing Weight Update Technique

We investigate approaches to improving the performance of random forests by introducing weight update and bootstrap techniques, and we propose a new algorithm that combines them smoothly. Experiments show that the proposed approach outperforms the original RF and works well with the weight update schemes used by the three most popular versions of AdaBoost. At the same time, it introduces no additional parameters to tune compared with RF.
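The core idea of combining a boosting-style weight update with RF's bootstrap step can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it draws each tree's bootstrap sample from the current example weights and then applies an AdaBoost.M1-style update (the function names `weighted_rf_fit` and `weighted_rf_predict` are hypothetical), assuming scikit-learn's `DecisionTreeClassifier` as the base learner.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def weighted_rf_fit(X, y, n_trees=10, rng_seed=0):
    """Grow randomized trees on weight-driven bootstrap samples.

    Sketch only: uses an AdaBoost.M1-style weight update between trees;
    the paper also considers other AdaBoost variants' update rules.
    """
    rng = np.random.default_rng(rng_seed)
    n = len(y)
    w = np.full(n, 1.0 / n)  # start with uniform example weights
    trees = []
    for _ in range(n_trees):
        # bootstrap sample drawn according to the current weights
        idx = rng.choice(n, size=n, replace=True, p=w)
        tree = DecisionTreeClassifier(
            max_features="sqrt",  # RF-style random feature subsets
            random_state=int(rng.integers(1 << 30)),
        )
        tree.fit(X[idx], y[idx])
        trees.append(tree)
        # AdaBoost.M1-style update: down-weight correctly classified points
        miss = tree.predict(X) != y
        err = w[miss].sum()
        if 0.0 < err < 0.5:
            beta = err / (1.0 - err)
            w[~miss] *= beta
            w /= w.sum()
    return trees

def weighted_rf_predict(trees, X):
    """Unweighted majority vote over the trees, as in the original RF."""
    votes = np.stack([t.predict(X) for t in trees])
    return np.array(
        [np.bincount(col.astype(int)).argmax() for col in votes.T]
    )
```

Note that, as the abstract claims, this scheme adds no new tuning parameters beyond RF's own (number of trees and feature-subset size); the weight update itself is parameter-free.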
