Abstract It is well known that nonlinear approximation has an advantage over linear schemes in the sense that it provides comparable approximation rates to those of the linear schemes, but for a larger class of approximands. This was established for spline approximations and for wavelet approximations, and more recently by DeVore and Ron (in press) [2] for homogeneous radial basis function (surface spline) approximations. However, no such results are known for the Gaussian function, the preferred kernel in machine learning and several engineering problems. In this paper we introduce and analyze a new algorithm for approximating functions using translates of Gaussian functions with varying tension parameters. At its heart, it employs the nonlinear approximation strategy of DeVore and Ron, but it selects kernels by a method that is far from straightforward. The crux of the difficulty lies in the need to vary the tension parameter of the Gaussian spatially, according to local information about the approximand: error analysis of Gaussian approximation schemes with varying tension is, by and large, an elusive target for approximation theorists. We show that our algorithm is suitably optimal, in the sense that it provides approximation rates similar to those of other established nonlinear methodologies, such as spline and wavelet approximations. As expected and desired, the approximation rates can be as high as needed and are essentially saturated only by the smoothness of the approximand.
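The paper's actual selection scheme is developed in its body; purely as a rough illustration of the general idea the abstract describes (a linear combination of Gaussian translates whose tension/width parameters vary spatially with local information about the approximand), the following minimal Python sketch is offered. It is not the authors' algorithm: the least-squares fit, the curvature-based width rule, and all names and constants below are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's algorithm): approximate a 1D function by a
# sum of Gaussian translates whose widths shrink where a crude finite-difference
# estimate of |f''| is large, i.e. where the approximand varies more rapidly.
import numpy as np

def fit_variable_width_gaussians(f, centers, widths, grid):
    """Least-squares fit of f on `grid` by Gaussians exp(-((x - c)/w)^2)."""
    A = np.exp(-((grid[:, None] - centers[None, :]) / widths[None, :]) ** 2)
    coeffs, *_ = np.linalg.lstsq(A, f(grid), rcond=None)
    return lambda x: np.exp(-((x[:, None] - centers[None, :])
                              / widths[None, :]) ** 2) @ coeffs

# Target with a smooth region and a more oscillatory region.
f = lambda x: np.sin(x) + 0.3 * np.sin(8 * x) * (x > 3)

grid = np.linspace(0.0, 6.0, 400)
centers = np.linspace(0.0, 6.0, 40)

# Spatially varying tension: an ad hoc rule that narrows the Gaussian where
# the local curvature estimate is large.
h = 1e-2
curvature = np.abs((f(centers + h) - 2 * f(centers) + f(centers - h)) / h**2)
widths = 0.5 / (1.0 + 0.2 * curvature)

approx = fit_variable_width_gaussians(f, centers, widths, grid)
print("max error on grid:", np.max(np.abs(approx(grid) - f(grid))))
```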
[1] C. de Boor et al., "Fourier analysis of the approximation power of principal shift-invariant spaces", 1992.
[2] R. E. Carlson et al., "Improved accuracy of multiquadric interpolation using variable shape parameters", 1992.
[3] C. Fefferman et al., "Some Maximal Inequalities", 1971.
[4] L. Grafakos, "Classical and Modern Fourier Analysis", 2003.
[5] A. Ron et al., "Approximation using scattered shifts of a multivariate function", 2008, arXiv:0802.2517.
[6] I. Steinwart et al., "Fast rates for support vector machines using Gaussian kernels", 2007, arXiv:0708.1838.
[7] B. Fornberg et al., "The Runge phenomenon and spatially variable shape parameters in RBF interpolation", Comput. Math. Appl., 2007.
[8] Y. Meyer, "Wavelets and Operators", 1993.
[9] Y. Ying et al., "Learnability of Gaussians with Flexible Variances", J. Mach. Learn. Res., 2007.
[10] B. Schölkopf et al., "Comparing support vector machines with Gaussian kernels to radial basis function classifiers", IEEE Trans. Signal Process., 1997.
[11] X. Li et al., "Approximation by radial bases and neural networks", Numerical Algorithms, 2004.
[12] L. Grafakos, "Modern Fourier Analysis", 2008.