Using feedforward networks to distinguish multivariate populations

It is shown how feedforward neural networks can be used to construct convenient and informative tests for nonspecific differences between populations with multivariate attributes. The key to the power of these tests is of independent interest: under mild conditions, feedforward neural networks have the universal approximation property when parameterized by weights in arbitrarily small neighborhoods.
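
The following is a minimal illustrative sketch, not the paper's exact test statistic: a single-hidden-layer feedforward network is trained to discriminate samples drawn from two multivariate populations, and its held-out classification accuracy is compared against a permutation null in which the population labels carry no information. All function names, the network size, and the permutation-based calibration are assumptions made for illustration.

```python
"""
Sketch of a neural-network two-sample test (assumed setup, not the paper's
exact procedure): train a one-hidden-layer sigmoid network to tell which of
two populations each observation came from; if held-out accuracy beats what
label permutations achieve, the populations are judged to differ.
"""
import numpy as np

rng = np.random.default_rng(0)


def train_mlp(X, y, hidden=8, lr=0.1, epochs=200):
    """Fit a one-hidden-layer sigmoid network by full-batch gradient descent."""
    n, d = X.shape
    W1 = rng.normal(scale=0.1, size=(d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.1, size=hidden)
    b2 = 0.0
    for _ in range(epochs):
        H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))   # hidden-layer activations
        p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))   # estimated P(label = 1)
        g = (p - y) / n                            # gradient of logistic loss w.r.t. output pre-activation
        W2 -= lr * (H.T @ g)
        b2 -= lr * g.sum()
        gH = np.outer(g, W2) * H * (1 - H)         # backpropagate through sigmoid hidden layer
        W1 -= lr * (X.T @ gH)
        b1 -= lr * gH.sum(axis=0)
    return W1, b1, W2, b2


def accuracy(params, X, y):
    """Held-out classification accuracy of the fitted network."""
    W1, b1, W2, b2 = params
    H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))
    return np.mean((p > 0.5) == y)


def two_sample_test(A, B, n_perm=200):
    """Permutation p-value for H0: samples A and B come from the same distribution."""
    X = np.vstack([A, B])
    y = np.r_[np.zeros(len(A)), np.ones(len(B))]
    idx = rng.permutation(len(X))                  # split into train / test halves
    half = len(X) // 2
    tr, te = idx[:half], idx[half:]
    observed = accuracy(train_mlp(X[tr], y[tr]), X[te], y[te])
    null_acc = []
    for _ in range(n_perm):
        yp = rng.permutation(y)                    # permuted labels: no real group structure
        null_acc.append(accuracy(train_mlp(X[tr], yp[tr]), X[te], yp[te]))
    p_value = np.mean(np.array(null_acc) >= observed)
    return observed, p_value


if __name__ == "__main__":
    A = rng.normal(0.0, 1.0, size=(200, 3))        # population 1
    B = rng.normal(0.5, 1.0, size=(200, 3))        # population 2: shifted mean
    acc, p = two_sample_test(A, B)
    print(f"held-out accuracy = {acc:.3f}, permutation p-value = {p:.3f}")
```

Under the null of identical populations, the network should classify no better than chance, so a small permutation p-value is evidence of some (nonspecific) difference between the two multivariate distributions.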
