On the closure of the set of functions that can be realized by a given multilayer perceptron

Given a multilayer perceptron (MLP) with a fixed architecture, there are functions that can be approximated to any degree of accuracy without increasing the number of hidden nodes. These functions belong to the closure F̄ of the set F of maps realizable by the MLP. In this paper, we give a list of maps with this property. In particular, it is proven that 1) rational functions belong to F̄ for networks with inverse tangent activation function and 2) products of polynomials and exponentials belong to F̄ for networks with sigmoid activation function. Moreover, for a restricted class of MLPs, we prove that the list is complete and give an analytic definition of F̄.
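A minimal numerical sketch of the phenomenon (not taken from the paper's proofs): since the derivative of arctan is the rational function 1/(1+x²), the difference quotient (arctan(x+h) − arctan(x))/h is itself a two-node arctan network with fixed architecture, and it converges uniformly to 1/(1+x²) as h → 0. The target is thus in the closure F̄ of the realizable set without being realized exactly by any single choice of weights.

```python
import numpy as np

# Target rational function: f(x) = 1 / (1 + x^2), the derivative of arctan.
def f(x):
    return 1.0 / (1.0 + x**2)

# Two-hidden-node arctan network: input weights 1, biases h and 0,
# output weights +1/h and -1/h. Its architecture is fixed; only the
# weights change with h.
def g(x, h):
    return (np.arctan(x + h) - np.arctan(x)) / h

x = np.linspace(-5.0, 5.0, 1001)
# Sup-norm error on [-5, 5] for shrinking h: the approximation improves
# without adding hidden nodes, illustrating membership in the closure.
errors = [float(np.max(np.abs(g(x, h) - f(x)))) for h in (1.0, 0.1, 0.01)]
print(errors)
```

The error here is O(h) (it is controlled by the second derivative of arctan), so halving h roughly halves the sup-norm gap; no finite h achieves error zero, which is exactly the distinction between F and F̄.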
