Statistical properties of the method of regularization with periodic Gaussian reproducing kernel

The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and has been successful in many practical applications. In this paper we consider the periodic version of Gaussian kernel regularization. We show, in the white noise model setting, that in spaces of very smooth functions, such as the infinite-order Sobolev space and the space of analytic functions, the method under consideration is asymptotically minimax; in finite-order Sobolev spaces the method is rate optimal, and its efficiency in terms of the constant, when compared with the minimax estimator, is reasonably high. The smoothing parameters in the periodic Gaussian regularization can be chosen adaptively without loss of asymptotic efficiency. The results derived in this paper give a partial explanation of the success of the Gaussian reproducing kernel in practice. Simulations are carried out to study the finite-sample properties of the periodic Gaussian regularization.
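To make the estimator concrete, the following is a minimal sketch of kernel regularization (kernel ridge regression) with a periodized Gaussian kernel on [0, 1), obtained by wrapping the Gaussian kernel over integer shifts. The function names, the truncation of the wrapped sum, and the specific parameter values are illustrative assumptions, not the paper's implementation; the representer theorem reduces the variational problem to the finite linear system solved below.

```python
import numpy as np

def periodic_gauss_kernel(s, t, sigma=0.1, terms=5):
    """Periodized (wrapped) Gaussian kernel on [0, 1):
    K(s, t) = sum_k exp(-(s - t - k)^2 / (2 sigma^2)),
    truncated to |k| <= terms (an illustrative choice)."""
    d = np.asarray(s)[:, None] - np.asarray(t)[None, :]
    K = np.zeros_like(d, dtype=float)
    for k in range(-terms, terms + 1):
        K += np.exp(-(d - k) ** 2 / (2.0 * sigma ** 2))
    return K

def fit_periodic_gauss(x, y, lam, sigma=0.1):
    """Regularized fit: minimize (1/n) sum (y_i - f(x_i))^2 + lam * ||f||_K^2.
    By the representer theorem, f_hat(.) = sum_i c_i K(., x_i) with
    c = (K + n * lam * I)^{-1} y."""
    n = len(x)
    K = periodic_gauss_kernel(x, x, sigma)
    c = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda xnew: periodic_gauss_kernel(xnew, x, sigma) @ c
```

Here `lam` plays the role of the smoothing parameter whose adaptive choice the paper studies; in practice it would be selected by a criterion such as generalized cross-validation rather than fixed in advance.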
