Generalized Single-Hidden Layer Feedforward Networks

In this paper, we propose a generalized single-hidden layer feedforward network (GSLFN) in which the output weights connecting randomly generated hidden units to the output nodes are polynomial functions of the inputs. The main contributions are as follows. For arbitrary N distinct observations with n-dimensional inputs, the augmented hidden output matrix of a GSLFN with L hidden nodes using any infinitely differentiable activation function consists of L sub-matrix blocks, each comprising n+1 column vectors. The rank of the augmented hidden output matrix is proved to be no less than that of the corresponding SLFN, which contributes to higher approximation capability. Furthermore, under minor constraints on the input observations, we rigorously prove that a GSLFN with L hidden nodes can exactly learn L(n+1) arbitrary distinct observations, i.e., n+1 times as many as an SLFN. If an approximation error is tolerated, then by optimizing the output weight coefficients the GSLFN may require fewer than N/(n+1) random hidden nodes to estimate targets with high accuracy. These theoretical results show that the GSLFN is significantly superior to the SLFN.
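
The construction described above can be illustrated with a minimal Python sketch. It assumes a tanh activation and first-degree polynomial output weights, so that each hidden node j contributes a block h_j(x)·[1, x_1, ..., x_n] to the augmented hidden output matrix of size N × L(n+1); the output weight coefficients are then obtained by least squares. The function name gslfn_fit and the use of NumPy's pseudo-inverse solver are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gslfn_fit(X, T, L, seed=None):
    """Illustrative least-squares fit of a GSLFN-style model.

    X : (N, n) input observations
    T : (N, m) target outputs
    L : number of randomly generated hidden nodes
    """
    rng = np.random.default_rng(seed)
    N, n = X.shape
    W = rng.standard_normal((n, L))   # random input weights
    b = rng.standard_normal(L)        # random hidden biases
    H = np.tanh(X @ W + b)            # (N, L) hidden node outputs

    # Augmented hidden output matrix: block j is h_j(x) * [1, x_1, ..., x_n],
    # giving L sub-matrix blocks of n+1 columns each, i.e. shape (N, L*(n+1)).
    X_aug = np.hstack([np.ones((N, 1)), X])                       # (N, n+1)
    G = (H[:, :, None] * X_aug[:, None, :]).reshape(N, L * (n + 1))

    # Output weight coefficients by least squares (pseudo-inverse solution).
    beta, *_ = np.linalg.lstsq(G, T, rcond=None)
    return W, b, beta
```

A prediction for a new input x would follow the same construction: form the row [h_1(x)·(1, x), ..., h_L(x)·(1, x)] and multiply it by the fitted coefficient matrix.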
