A Method to Boost Naïve Bayesian Classifiers

In this paper, we introduce a new method to improve the performance of combining boosting and naïve Bayesian learning. Instead of combining boosting and naïve Bayesian learning directly, which has been shown to be unsatisfactory for improving performance, we select the training samples dynamically by a bootstrap method for the construction of naïve Bayesian classifiers, and hence generate very different, unstable base classifiers for boosting. In addition, we modify the weight-adjusting step of the boosting algorithm so as to minimize the overlapping errors of its constituent classifiers. We conducted a series of experiments showing that the new method not only performs much better than naïve Bayesian classifiers or directly boosted naïve Bayesian classifiers, but also reaches its optimal performance much more quickly than boosting stumps or boosting decision trees incorporated with naïve Bayesian learning.
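The sketch below illustrates the core idea described above: an AdaBoost-style loop whose naïve Bayesian base classifiers are trained on bootstrap resamples drawn according to the current boosting weights, rather than on the weighted data directly, so that each round produces a genuinely different base classifier. The function name, the use of scikit-learn's GaussianNB, and the standard AdaBoost re-weighting rule are illustrative assumptions, not the authors' implementation; in particular, the paper's modified weight-adjustment rule for reducing overlapping errors is replaced here by the ordinary AdaBoost update.

```python
# A minimal sketch, assuming a Gaussian naive Bayes base learner and the
# standard AdaBoost update in place of the paper's modified weight adjustment.
import numpy as np
from sklearn.naive_bayes import GaussianNB

def boosted_naive_bayes(X, y, n_rounds=25, rng=np.random.default_rng(0)):
    n = len(y)
    w = np.full(n, 1.0 / n)              # boosting weights over training samples
    models, alphas = [], []
    for _ in range(n_rounds):
        # Bootstrap resample according to the current weights so that each
        # naive Bayes base classifier sees a different training set.
        idx = rng.choice(n, size=n, replace=True, p=w)
        clf = GaussianNB().fit(X[idx], y[idx])
        pred = clf.predict(X)
        err = np.dot(w, pred != y)       # weighted training error
        if err >= 0.5:                   # skip rounds no better than chance
            continue
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        # Standard AdaBoost re-weighting (stand-in for the paper's modified rule):
        # up-weight misclassified samples so later classifiers focus on them.
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()
        models.append(clf)
        alphas.append(alpha)

    def predict(Xq):
        # Weighted vote of the base classifiers.
        classes = models[0].classes_
        votes = sum(a * (m.predict(Xq)[:, None] == classes)
                    for m, a in zip(models, alphas))
        return classes[np.argmax(votes, axis=1)]

    return predict
```

Resampling rather than weighting is what injects the instability that plain naïve Bayes lacks as a base learner; the paper's further contribution, the modified weight update that discourages overlapping errors among the constituent classifiers, would replace the AdaBoost update shown above.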