Sigmoidal FFANNs and the best approximation property

Abstract Feedforward artificial neural networks (FFANNs) can approximate any continuous function arbitrarily well. However, many other approximation schemes, including those based on polynomials and wavelets, share this universal approximation property, and each of them can be given an FFANN form. The universal approximation property is therefore not a suitable criterion for comparing approximation schemes; the best approximation property is one of the crucial aspects that can be utilised for this purpose. We establish that the sets of functions represented by finite-sized networks, as well as by networks of arbitrary size, are open. We prove this result for both linear-output and sigmoidal-output FFANNs. We then establish that these networks lack the best approximation property and discuss the physical relevance of the results obtained.
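The universal approximation claim for linear-output sigmoidal FFANNs can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's construction) draws random hidden weights and biases for a single hidden layer of sigmoidal units and fits only the linear output weights by least squares to a continuous target on [0, 1]; the target function, network width, and weight scales are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Continuous target function on [0, 1] (arbitrary choice for illustration).
def f(x):
    return np.sin(2.0 * np.pi * x)

x = np.linspace(0.0, 1.0, 200)

# One hidden layer of sigmoidal units with randomly drawn weights/biases;
# only the linear output weights c are fitted (ordinary least squares).
n_hidden = 50
w = rng.normal(scale=10.0, size=n_hidden)
b = rng.uniform(-10.0, 10.0, size=n_hidden)

H = sigmoid(np.outer(x, w) + b)           # hidden-layer activations, shape (200, 50)
c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

# Sup-norm error of the network output H @ c against the target on the grid.
max_err = np.max(np.abs(H @ c - f(x)))
print(f"max |network - target| on grid: {max_err:.4f}")
```

Even this crude random-feature construction drives the sup-norm error on the grid well below 1, consistent with the density results cited in the paper; note that density guarantees arbitrarily good approximations exist, not that a best one is attained.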
