Decision trees offer good comprehensibility, while neural network ensembles offer strong generalization ability. These merits are integrated in a novel decision tree algorithm, NeC4.5. The algorithm first trains a neural network ensemble. The trained ensemble is then used to generate a new training set by replacing the desired class labels of the original training examples with the outputs of the ensemble. Extra training examples are also generated from the trained ensemble and added to the new training set. Finally, a C4.5 decision tree is grown from the new training set. Since its learning results are decision trees, NeC4.5 is more comprehensible than a neural network ensemble. Moreover, experiments show that NeC4.5 decision trees can generalize better than C4.5 decision trees.
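A minimal sketch of the procedure described above, assuming scikit-learn as a stand-in: BaggingClassifier over MLPClassifier plays the role of the neural network ensemble, and DecisionTreeClassifier (which implements CART, not C4.5) plays the role of the final tree learner. The Gaussian-noise generation of extra examples is an illustrative assumption, not the paper's exact procedure.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier


def nec45_sketch(X, y, n_extra=1000, noise_scale=0.1, random_state=0):
    rng = np.random.default_rng(random_state)

    # 1. Train a neural network ensemble on the original training set.
    ensemble = BaggingClassifier(
        estimator=MLPClassifier(hidden_layer_sizes=(10,), max_iter=500),
        n_estimators=10,
        random_state=random_state,
    )
    ensemble.fit(X, y)

    # 2. Replace the desired class labels of the original examples
    #    with the labels output by the trained ensemble.
    y_new = ensemble.predict(X)

    # 3. Generate extra examples (here: noisy copies of original inputs,
    #    an assumption for illustration) and label them with the ensemble.
    idx = rng.integers(0, len(X), size=n_extra)
    X_extra = X[idx] + rng.normal(scale=noise_scale, size=X[idx].shape)
    y_extra = ensemble.predict(X_extra)

    # 4. Grow a decision tree from the new training set.
    X_train = np.vstack([X, X_extra])
    y_train = np.concatenate([y_new, y_extra])
    tree = DecisionTreeClassifier(random_state=random_state)
    tree.fit(X_train, y_train)
    return tree
```

The returned tree keeps the comprehensible decision-tree representation while its training signal comes from the ensemble, which is the core idea of the algorithm.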