Bagging, Boosting, and Bloating in Genetic Programming

We present an extension of Genetic Programming (GP) based on resampling techniques, namely Bagging and Boosting. Both methods manipulate the training data in order to improve the learning algorithm; in theory, they can significantly reduce the error of any weak learning algorithm by repeatedly running it. This paper extends GP by dividing the whole population into a set of sub-populations, each of which is evolved using the Bagging or Boosting method. The effectiveness of our approach is shown experimentally, and its performance is compared with that of traditional GP with respect to the bloating effect.
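As a rough illustration of the Bagging side of the approach, the sketch below draws one bootstrap resample of the training data per sub-population and aggregates the resulting models by averaging. This is only a minimal sketch of the resampling scheme: the `evolve` function here (a least-squares slope fit on a toy regression task) is a hypothetical stand-in for the GP evolution of each sub-population, whose details are not given in the abstract.

```python
import random

def bootstrap_samples(data, n_subpops, rng):
    """Draw one bootstrap resample (sampling with replacement) per sub-population."""
    return [[rng.choice(data) for _ in range(len(data))] for _ in range(n_subpops)]

def bagged_predict(models, x):
    """Aggregate the sub-populations' outputs by averaging (bagging for regression)."""
    return sum(m(x) for m in models) / len(models)

def evolve(sample):
    """Stand-in for GP evolution: fit a slope to this sub-population's
    bootstrap sample by least squares and return it as a model."""
    num = sum(x * y for x, y in sample)
    den = sum(x * x for x, _ in sample) or 1
    slope = num / den
    return lambda x, s=slope: s * x

rng = random.Random(0)
data = [(x, 2 * x) for x in range(10)]  # toy regression target y = 2x
samples = bootstrap_samples(data, n_subpops=5, rng=rng)
models = [evolve(s) for s in samples]
print(bagged_predict(models, 3))  # each resample recovers slope 2, so prints 6.0
```

In a Boosting variant, the resamples would instead be drawn from a weight distribution that is updated to emphasize training cases the current models mispredict, rather than uniformly as above.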