Suppose the observations $(t_i, y_i)$, $i = 1, \dots, n$, with predictors $t_i = (t_{i1}, \dots, t_{ir})$, follow the additive model

$$y_i = \sum_{j=1}^{r} g_j(t_{ij}) + \varepsilon_i,$$

where the $g_j$ are unknown smooth functions. The additive components can be estimated by approximating each $g_j$ with the sum of a linear fit and a truncated Fourier series of cosines, and minimizing a penalized least-squares loss over the coefficients. When fitting an additive model with $r$ predictors, this finite-dimensional basis approximation reduces the computation drastically, since it does not require the backfitting algorithm. The cross-validation (CV) [or generalized cross-validation (GCV)] score for the additive fit is then obtained in a further $O(n)$ operations. A search path in the $r$-dimensional space of degrees of freedom is proposed along which the CV (GCV) continuously decreases; the path ends when an increase in the degrees of freedom of any of the predictors yields an increase in the CV (GCV). The procedure is illustrated on a meteorological data set.
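The abstract describes the algorithm without code; the numpy sketch below illustrates the main ingredients under several assumptions that are not in the original: predictors are rescaled to $[0, 1]$, each $g_j$ is represented as a linear term plus $K$ cosines $\cos(k\pi t)$ whose integrated squared second derivative gives a diagonal penalty proportional to $(k\pi)^4$, the hat-matrix trace is formed directly (the paper obtains CV/GCV in $O(n)$ by other means), and the paper's path through degrees-of-freedom space is replaced by a simple greedy move on a per-predictor $\lambda$ grid. All names (`cosine_basis`, `additive_design`, `gcv`, `search_path`) are illustrative, not from the paper.

```python
import numpy as np

def cosine_basis(t, K):
    """Columns cos(k*pi*t), k = 1..K, for t rescaled to [0, 1]."""
    k = np.arange(1, K + 1)
    return np.cos(np.pi * np.outer(t, k))

def additive_design(T, K):
    """Design for an additive fit: intercept, then a linear term and K
    cosine terms per predictor.  Returns the matrix, diagonal roughness
    weights (zero on intercept/linear columns, (k*pi)^4 on cosine
    columns, i.e. the integrated squared second derivative of the
    cosine series up to a constant), and each column's predictor."""
    n, r = T.shape
    k = np.arange(1, K + 1)
    cols, pen, block = [np.ones((n, 1))], [0.0], [0]
    for j in range(r):
        cols += [T[:, [j]], cosine_basis(T[:, j], K)]
        pen += [0.0] + list((np.pi * k) ** 4)
        block += [j] * (K + 1)
    return np.hstack(cols), np.array(pen), np.array(block)

def gcv(y, X, pen, block, lam):
    """Penalized least squares with per-predictor smoothing parameters
    lam[j].  The penalty matrix is diagonal, which is what makes the
    cosine basis cheap.  Returns the GCV score n*RSS/(n - tr(H))^2 and
    the total degrees of freedom tr(H)."""
    n = len(y)
    A = X.T @ X + np.diag(lam[block] * pen)
    coef = np.linalg.solve(A, X.T @ y)
    df = np.trace(X @ np.linalg.solve(A, X.T))
    rss = np.sum((y - X @ coef) ** 2)
    return n * rss / (n - df) ** 2, df

def search_path(y, T, K=20, grid=None):
    """Greedy stand-in for the paper's search path: start every
    predictor at its smoothest setting (largest lambda, fewest df) and
    repeatedly take any one-notch decrease in a predictor's lambda
    (an increase in its df) that lowers GCV; stop when no move helps."""
    if grid is None:
        grid = 10.0 ** np.arange(-8, 3)  # larger lambda = fewer df
    X, pen, block = additive_design(T, K)
    r = T.shape[1]
    idx = np.full(r, len(grid) - 1)      # start at the largest lambda
    best, _ = gcv(y, X, pen, block, grid[idx])
    improved = True
    while improved:
        improved = False
        for j in range(r):
            if idx[j] == 0:
                continue
            trial = idx.copy()
            trial[j] -= 1                # more df for predictor j
            score, _ = gcv(y, X, pen, block, grid[trial])
            if score < best:
                best, idx, improved = score, trial, True
    return grid[idx], best

# Illustrative use on synthetic data with r = 2 predictors.
rng = np.random.default_rng(0)
T = rng.uniform(size=(200, 2))
y = np.sin(2 * np.pi * T[:, 0]) + T[:, 1] ** 2 + 0.3 * rng.standard_normal(200)
lams, score = search_path(y, T)
print("chosen lambdas:", lams, "GCV:", score)
```

The greedy loop mirrors the stated stopping rule: it accepts any single-predictor increase in degrees of freedom (one grid notch down in $\lambda_j$) that lowers GCV, and terminates when no such increase helps.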