Model Selection Via Multifold Cross Validation
In model selection, it is known that the simple leave-one-out cross validation method is apt to select overfitted models. In an attempt to remedy this problem, we consider two notions of multifold cross validation (MCV and MCV*) criteria. In the case of linear regression models, their performance is studied and compared with the simple CV method. As expected, it turns out that MCV indeed reduces the chance of overfitting. The intent of MCV* is rather different from that of MCV. The differences between these two notions of MCV are also discussed. Our result explains the phenomena observed by Breiman & Spector.
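To make the setting concrete, the sketch below scores nested linear regression submodels by a generic v-fold cross-validation criterion and picks the submodel minimising the held-out squared error. This is an illustrative stand-in, not the paper's exact MCV or MCV* construction; the fold count, the simulated design, and the function name `mcv_score` are all assumptions for the example.

```python
import numpy as np

def mcv_score(X, y, n_splits=5, seed=0):
    """Average squared prediction error over n_splits held-out folds.

    A generic v-fold cross-validation score; an illustrative
    stand-in for the paper's MCV/MCV* criteria, not their exact form.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    errs = []
    for test in np.array_split(idx, n_splits):
        train = np.setdiff1d(idx, test)
        # Ordinary least squares fit on the training folds
        beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        resid = y[test] - X[test] @ beta
        errs.append(np.mean(resid ** 2))
    return float(np.mean(errs))

# Simulated data: intercept plus five predictors, of which only the
# first two carry signal (hypothetical coefficients for illustration).
rng = np.random.default_rng(1)
n = 200
X_full = np.column_stack([np.ones(n), rng.normal(size=(n, 5))])
y = X_full[:, :3] @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

# Score the nested submodels using the first k columns; the selected
# model is the minimiser of the cross-validation score.
scores = {k: mcv_score(X_full[:, :k], y) for k in range(1, 7)}
best_k = min(scores, key=scores.get)
print(best_k, scores[best_k])
```

With a strong signal in the first three columns, underfitted submodels incur a large bias term in the held-out error, so the criterion favours models containing the true predictors; the paper's point is that enlarging the held-out fraction (as in MCV) further discourages the overfitted submodels that leave-one-out tends to admit.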
[1] R. Shibata. Selection of the order of an autoregressive model by Akaike's information criterion, 1976.
[2] B. Efron. How biased is the apparent error rate of a prediction rule, 1986.
[3] W. Härdle, et al. How far are automatically chosen regression smoothing parameters from their optimum, 1988.
[4] J. Shao, et al. A general theory for jackknife variance estimation, 1989.
[5] L. Breiman, et al. Submodel selection and evaluation in regression: the X-random case, 1992.