Stochastic Configuration Networks: Fundamentals and Algorithms

This paper contributes to the development of randomized methods for neural networks. The proposed learner models, termed stochastic configuration networks (SCNs), are built incrementally by stochastic configuration (SC) algorithms. In contrast to existing randomized learning algorithms for single-layer feed-forward networks, the input weights and biases of the hidden nodes are randomly assigned under a supervisory mechanism, and the output weights are evaluated analytically in either a constructive or a selective manner. As the foundation of SCN-based data modeling, we establish theoretical results on the universal approximation property. Three versions of the SC algorithm are presented for data regression and classification problems. Simulation results on both regression and classification indicate notable merits of the proposed SCNs: less human intervention in setting the network size, adaptation of the scope of the random parameters, fast learning, and sound generalization.
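
As a rough illustration of the stochastic configuration idea described above, the sketch below grows a single-output SCN node by node in Python/NumPy: candidate input weights and biases are drawn at random from a progressively widened scope, a candidate is admitted only if a supervisory inequality on the current residual holds, and all output weights are then re-solved analytically by least squares. This is a minimal sketch of the global least-squares flavour of the algorithm; the function name build_scn, the scope schedule, and the candidate count are illustrative choices of ours, not prescriptions from the paper.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def build_scn(X, y, L_max=50, r=0.99, scopes=(0.5, 1, 5, 10, 50),
              candidates=100, tol=1e-3, seed=0):
    """Grow a single-output SCN until the residual is small enough."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    H = np.empty((N, 0))              # hidden-layer output matrix
    e = y.astype(float).copy()        # current residual
    W, b, beta = [], [], np.zeros(0)
    for L in range(1, L_max + 1):
        mu = (1.0 - r) / (L + 1)      # relaxing term in the supervisory check
        best = None
        for lam in scopes:            # widen the random scope until a node fits
            for _ in range(candidates):
                w = rng.uniform(-lam, lam, d)
                bias = rng.uniform(-lam, lam)
                g = sigmoid(X @ w + bias)
                # supervisory mechanism: admit candidates with xi > 0 and
                # keep the one with the largest xi
                xi = (e @ g) ** 2 / (g @ g) - (1.0 - r - mu) * (e @ e)
                if xi > 0 and (best is None or xi > best[0]):
                    best = (xi, w, bias, g)
            if best is not None:
                break
        if best is None:
            break                     # no admissible candidate: stop growing
        W.append(best[1]); b.append(best[2])
        H = np.column_stack([H, best[3]])
        beta = np.linalg.lstsq(H, y, rcond=None)[0]   # analytic output weights
        e = y - H @ beta
        if np.linalg.norm(e) < tol:
            break
    return np.array(W), np.array(b), beta

# Toy usage: fit y = sin(3x) on [0, 1]
X = np.linspace(0, 1, 200).reshape(-1, 1)
y = np.sin(3 * X).ravel()
W, b, beta = build_scn(X, y)
print(len(b), "nodes, residual", np.linalg.norm(y - sigmoid(X @ W.T + b) @ beta))

The defining design choice is the supervisory inequality: unlike purely random assignment (as in random vector functional-link networks), a node is only accepted when its correlation with the residual is large enough, which is what underpins the universal approximation guarantee established in the paper.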
