AdaBoost algorithm using multi-step correction

An AdaBoost algorithm with multi-step correction improves the convergence of the traditional AdaBoost (Adaptive Boosting) algorithm. In this algorithm, the update of the training-sample distribution depends not only on the current classifier but also on the previously generated ones, and the weights of earlier classifiers are revised each time a new classifier is added to the ensemble. Experiments on the UCI "Diabetes", "Heart statlog" and "Breast cancer Wisconsin" datasets show that the modified algorithm achieves lower training and test errors than AdaBoost. The multi-step correction not only improves the efficiency of the search for new member classifiers but also enhances the overall performance of the classifier ensemble.
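The abstract does not state the update equations, so the following is only one plausible reading of the idea: standard AdaBoost with decision stumps, where after each new stump is added, the weights (alphas) of all earlier stumps are re-derived under the margin of the rest of the current ensemble, and the sample distribution is recomputed from the whole ensemble rather than from the newest classifier alone. The function names, the `correction_passes` parameter, and the exact correction rule (a coordinate-descent-style pass on the exponential loss) are illustrative assumptions, not the paper's published method.

```python
import numpy as np

def stump_fit(X, y, w):
    """Find the decision stump (feature, threshold, polarity) with the
    lowest weighted error under the sample distribution w."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best, best_err = (j, t, pol), err
    return best, best_err

def stump_predict(stump, X):
    j, t, pol = stump
    return np.where(pol * (X[:, j] - t) >= 0, 1, -1)

def adaboost_multistep(X, y, T=10, correction_passes=1):
    """AdaBoost variant with an assumed multi-step correction:
    earlier classifier weights are revised whenever a new one is added."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    stumps, alphas = [], []
    for _ in range(T):
        stump, err = stump_fit(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        stumps.append(stump)
        alphas.append(0.5 * np.log((1 - err) / err))
        # Multi-step correction (assumed form): revisit each earlier
        # classifier and re-derive its weight under the margin of the
        # rest of the current ensemble.
        for _ in range(correction_passes):
            for k in range(len(stumps)):
                margin = np.zeros(n)
                for i, (s, a) in enumerate(zip(stumps, alphas)):
                    if i != k:
                        margin += a * stump_predict(s, X) * y
                wk = np.exp(-margin)
                wk /= wk.sum()
                ek = np.clip(wk[stump_predict(stumps[k], X) != y].sum(),
                             1e-10, 1 - 1e-10)
                alphas[k] = 0.5 * np.log((1 - ek) / ek)
        # Distribution update driven by the whole ensemble, not only the
        # newest classifier (the "previous classifiers" dependence).
        F = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
        w = np.exp(-y * F)
        w /= w.sum()
    return stumps, alphas

def predict(stumps, alphas, X):
    F = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(F)
```

The correction pass keeps the exponential loss non-increasing, since each alpha is re-optimized while the other classifiers are held fixed; setting `correction_passes=0` recovers a purely greedy variant.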