Some Analysis and Research of the AdaBoost Algorithm
The AdaBoost algorithm improves the performance of weak classifiers by combining them into an ensemble. Because it automatically adapts to the error rate of the base learning algorithm during training by dynamically adjusting the weight of each sample, it has attracted wide attention. This paper gives an introduction to AdaBoost and analyzes several aspects of the algorithm itself.
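The boosting procedure the abstract describes, reweighting samples according to the base learner's error rate and combining weak classifiers into a stronger ensemble, can be sketched as follows. This is a minimal illustration of discrete AdaBoost using one-feature decision stumps as the weak learner; the function names, the stump search, and the toy data are assumptions for illustration, not the paper's own implementation.

```python
import numpy as np

def adaboost(X, y, T=20):
    """Discrete AdaBoost with exhaustive one-feature threshold stumps.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Returns a list of (feature, threshold, polarity, alpha) tuples.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # start with uniform sample weights
    ensemble = []
    for _ in range(T):
        best, best_err = None, np.inf
        # weak learner: pick the stump with lowest weighted error
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        err = max(best_err, 1e-10)     # guard against log(0)
        if err >= 0.5:                 # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1 - err) / err)   # classifier weight
        j, thr, pol = best
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        # dynamic regulation: raise weights of misclassified samples,
        # lower weights of correct ones, then renormalize
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((j, thr, pol, alpha))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all stumps in the ensemble."""
    score = np.zeros(len(X))
    for j, thr, pol, alpha in ensemble:
        score += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.where(score >= 0, 1, -1)

# Illustrative use on a 1-D interval problem no single stump can solve:
# X = np.arange(10).reshape(-1, 1).astype(float)
# y = np.where((X[:, 0] >= 3) & (X[:, 0] <= 6), 1, -1)
# predict(adaboost(X, y, T=3), X) fits the training set exactly.
```

A single stump can only split the line once, so it cannot label the middle interval correctly on its own; three boosting rounds combine stumps whose weighted votes recover the interval, which is exactly the "weak learners to strong ensemble" effect the abstract refers to.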