When Does Overfitting Decrease Prediction Accuracy in Induced Decision Trees and Rule Sets?
Researchers studying classification techniques based on induced decision trees and rule sets have found that the model which best fits the training data is unlikely to yield optimal performance on fresh data. Such a model is typically overfitted, in the sense that it captures not only true regularities reflected in the training data, but also chance patterns which have no significance for classification and which, in fact, reduce the model's predictive accuracy. Various simplification methods have been shown to help avoid overfitting in practice. Here, through detailed analysis of a paradigmatic example, I attempt to uncover the conditions under which these techniques work as expected. One auxiliary result of importance is the identification of conditions under which overfitting does not decrease predictive accuracy, and hence under which it would be a mistake to apply simplification techniques if predictive accuracy is the key goal.
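The phenomenon the abstract describes can be reproduced in miniature. The sketch below (an illustrative assumption, not the paper's own experiment or analysis) builds a toy domain with one relevant binary attribute, several irrelevant ones, and class noise. An "overfitted" model that memorises a majority label for every full attribute vector fits chance patterns in the noise, while a "simplified" one-attribute rule learned from the same training data generalises better to fresh data:

```python
import random
from collections import defaultdict

random.seed(0)

N_ATTRS = 8   # only attribute 0 is truly relevant; the rest are noise
NOISE = 0.2   # probability that an example's class label is flipped

def make_data(n):
    """Binary attribute vectors; true class = attribute 0, plus class noise."""
    data = []
    for _ in range(n):
        x = tuple(random.randint(0, 1) for _ in range(N_ATTRS))
        y = x[0]
        if random.random() < NOISE:
            y = 1 - y
        data.append((x, y))
    return data

train, test = make_data(200), make_data(2000)

# Overfitted model: memorise a majority label for every full attribute
# vector seen in training -- this captures chance patterns in the noise.
cell = defaultdict(lambda: [0, 0])
for x, y in train:
    cell[x][y] += 1
overfit = {x: int(c[1] > c[0]) for x, c in cell.items()}

# Simplified model: a one-attribute rule learned from the same data.
bit = {0: [0, 0], 1: [0, 0]}
for x, y in train:
    bit[x[0]][y] += 1
simple = {b: int(c[1] > c[0]) for b, c in bit.items()}

def accuracy(predict, data):
    return sum(predict(x) == y for x, y in data) / len(data)

# Unseen attribute vectors fall back to class 0 for the memorising model.
acc_overfit = accuracy(lambda x: overfit.get(x, 0), test)
acc_simple = accuracy(lambda x: simple[x[0]], test)
print(f"overfitted model: {acc_overfit:.3f}  simplified rule: {acc_simple:.3f}")
```

Under these assumptions the simplified rule approaches the Bayes-optimal accuracy of 1 - NOISE on fresh data, while the memorising model is dragged down by noise-fitted cells and unseen vectors; the paper's point is that this gap closes, and simplification stops paying, under certain conditions (e.g. as noise vanishes or training coverage becomes exhaustive).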