Breiman, Friedman, Olshen and Stone (1984) expounded Classification and Regression Trees (CART), a method for nonparametric discrimination and regression. Taylor and Silverman (1993) presented a new splitting criterion for growing classification trees, called the mean posterior improvement criterion. This paper extends the mean posterior improvement criterion to regression trees. The extension is made via kernel density estimation. General results are obtained on selecting the bandwidth (smoothing parameter) appropriate to estimating the mean posterior improvement criterion, and these results are adapted to give a practical implementation of the criterion for regression trees. Examples of the behaviour of the new criterion relative to currently used splitting criteria are given.
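The extension rests on kernel density estimation, in which a bandwidth must be chosen before the density can be estimated. The paper derives bandwidth rules tailored to the mean posterior improvement criterion; those rules are not reproduced here, but the general mechanics can be illustrated with a generic Gaussian kernel estimator using Silverman's (1986) rule-of-thumb bandwidth. This is a hedged sketch of standard kernel density estimation, not the paper's criterion-specific bandwidth selector; the function names are illustrative.

```python
import math

def silverman_bandwidth(data):
    """Rule-of-thumb bandwidth h = 0.9 * min(sd, IQR/1.34) * n^(-1/5)
    (Silverman, 1986). A generic default, not the criterion-specific
    bandwidth derived in the paper."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    s = sorted(data)

    def quantile(p):
        # Linear interpolation between order statistics.
        k = (n - 1) * p
        f, c = math.floor(k), math.ceil(k)
        return s[f] + (k - f) * (s[c] - s[f])

    iqr = quantile(0.75) - quantile(0.25)
    spread = min(sd, iqr / 1.34) if iqr > 0 else sd
    return 0.9 * spread * n ** (-1 / 5)

def kde(x, data, h):
    """Gaussian kernel density estimate at point x with bandwidth h."""
    norm = len(data) * h * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data) / norm
```

A smaller bandwidth tracks the sample more closely but is noisier; a larger one smooths more aggressively, which is why the bandwidth choice matters when the estimated density feeds a splitting criterion.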
[1] R. Tibshirani, et al. Generalized additive models for medical research. Statistical Methods in Medical Research, 1986.
[2] L. Goldstein, et al. Efficient nonparametric testing by functional estimation, 1991.
[3] Leo Breiman, et al. Classification and Regression Trees, 1984.
[4] B. Silverman, et al. Block diagrams and splitting criteria for classification trees, 1993.
[5] Daryl Pregibon, et al. Tree-based models, 1992.
[6] B. Silverman. Density estimation for statistics and data analysis, 1986.
[7] Mark R. Segal, et al. Regression Trees for Censored Data, 1988.