Tight Bounds on Rates of Neural-Network Approximation

The complexity of neural networks, measured by the number of hidden units, is studied in terms of rates of approximation. Limitations on improving upper bounds of order O(n^{-1/2}) on such rates are investigated for perceptron networks with certain periodic and sigmoidal activation functions.
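The O(n^{-1/2}) rate referred to above is the classical Maurey–Jones–Barron bound: functions expressible as an expectation over a dictionary of units can be approximated by an n-term average with error decaying like n^{-1/2}, independently of the input dimension. The following is an illustrative sketch only (not the paper's construction), using the fact that f(x) = E_w[cos(wx)] = exp(-x^2/2) for w ~ N(0, 1); the target, grid, and trial counts are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 201)

# Target with an exact integral representation over cosine units:
# f(x) = E_w[cos(w x)], w ~ N(0, 1), which equals exp(-x^2 / 2).
target = np.exp(-xs**2 / 2)

def approx_error(n):
    """Sup-norm error (on the grid) of an n-unit Monte Carlo network."""
    w = rng.standard_normal(n)
    # n-unit "network": the average of n random cosine units.
    net = np.mean(np.cos(np.outer(xs, w)), axis=1)
    return np.max(np.abs(net - target))

# Averaged over trials, the error should shrink roughly like n^{-1/2},
# so error * sqrt(n) stays of the same order as n grows.
for n in (10, 100, 1000):
    mean_err = np.mean([approx_error(n) for _ in range(50)])
    print(f"n = {n:5d}   mean sup error = {mean_err:.4f}   "
          f"scaled by sqrt(n) = {mean_err * np.sqrt(n):.3f}")
```

The point of the sketch is only the scaling: the scaled column stays roughly flat while the raw error drops, which is the n^{-1/2} behavior whose tightness the paper investigates.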
