A comprehensive experimental evaluation of orthogonal polynomial expanded random vector functional link neural networks for regression

Abstract The Random Vector Functional Link Neural Network (RVFLNN) enables fast learning through random selection of the input weights, so that the learning procedure determines only the output weights. Unlike the Extreme Learning Machine (ELM), RVFLNN exploits direct connections between the input layer and the output layer, which makes RVFLNN a more general class of networks. Although RVFLNN was proposed more than two decades ago (Pao, Park, & Sobajic, 1994), the nonlinear expansion of the input vector into a set of orthogonal functions has not been studied. The Orthogonal Polynomial Expanded Random Vector Functional Link Neural Network (OPE-RVFLNN) combines the advantages of expanding the input vector with those of randomly determining the input weights. Through a comprehensive experimental evaluation on 30 UCI regression datasets, we tested four orthogonal polynomials (Chebyshev, Hermite, Laguerre, and Legendre) and three activation functions (tansig, logsig, tribas). Rigorous non-parametric statistical hypothesis testing confirms two major conclusions drawn by Zhang and Suganthan for classification (Zhang & Suganthan, 2015) and by Ren et al. for time-series prediction (Ren, Suganthan, Srikanth, & Amaratunga, 2016) in their RVFLNN papers: direct links between the input and output layers are essential for improved network performance, and ridge regression yields significantly better network parameters than Moore-Penrose pseudoinversion. Our research shows a significant improvement in network performance when the tansig activation function is combined with the Chebyshev orthogonal polynomial for regression problems. The conclusions drawn from this study may serve as guidelines for OPE-RVFLNN development and implementation for regression problems.
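
To make the architecture concrete, below is a minimal Python/NumPy sketch of OPE-RVFLNN training as the abstract describes it: the input vector is expanded with Chebyshev polynomials of the first kind, hidden features are produced by random weights with a tansig (tanh) activation, direct links from the expanded input to the output are retained, and the output weights are solved by ridge regression. The function names, expansion order, and regularization constant lam are illustrative assumptions, not the authors' implementation.

import numpy as np

def chebyshev_expand(X, order=3):
    # Expand every feature with Chebyshev polynomials T_1..T_order of the
    # first kind via the recurrence T_{n+1}(x) = 2x*T_n(x) - T_{n-1}(x).
    # Inputs are assumed to be pre-scaled to [-1, 1].
    T_prev, T_curr = np.ones_like(X), X
    terms = [T_curr]
    for _ in range(order - 1):
        T_prev, T_curr = T_curr, 2.0 * X * T_curr - T_prev
        terms.append(T_curr)
    return np.hstack(terms)

def train_ope_rvflnn(X, y, n_hidden=50, order=3, lam=1e-3, seed=None):
    rng = np.random.default_rng(seed)
    Xe = chebyshev_expand(X, order)              # orthogonal expansion
    W = rng.uniform(-1.0, 1.0, (Xe.shape[1], n_hidden))
    b = rng.uniform(-1.0, 1.0, n_hidden)
    H = np.tanh(Xe @ W + b)                      # tansig hidden features
    D = np.hstack([Xe, H])                       # direct links + hidden layer
    # Ridge regression for the output weights: beta = (D'D + lam*I)^{-1} D'y.
    # The Moore-Penrose alternative evaluated in the paper would instead be
    # beta = np.linalg.pinv(D) @ y.
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def predict_ope_rvflnn(X, W, b, beta, order=3):
    Xe = chebyshev_expand(X, order)
    return np.hstack([Xe, np.tanh(Xe @ W + b)]) @ beta

At prediction time the same expansion order and random weights must be reused; retaining Xe inside D is the direct input-output link that distinguishes RVFLNN from an ELM, which would regress on H alone.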

[1] Ling Tang et al., A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting, 2017, Appl. Soft Comput.

[2] Ponnuthurai N. Suganthan et al., Random vector functional link network for short-term electricity load demand forecasting, 2016, Inf. Sci.

[3] Da Ruan et al., Pipelined functional link artificial recurrent neural network with the decision feedback structure for nonlinear channel equalization, 2011, Inf. Sci.

[4] Kurt Hornik et al., Approximation capabilities of multilayer feedforward networks, 1991, Neural Networks.

[5] Hubert A. B. Te Braake et al., Random activation weight neural net (RAWN) for fast non-iterative training, 1995.

[6] Nitish Srivastava et al., Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.

[7] L. P. Wang et al., Comments on "The Extreme Learning Machine", 2008, IEEE Trans. Neural Networks.

[8] Francisco Herrera et al., A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms, 2011, Swarm Evol. Comput.

[9] C. L. Philip Chen et al., A rapid learning and dynamic stepwise updating algorithm for flat neural networks and the application to time-series prediction, 1999, IEEE Trans. Syst. Man Cybern. Part B.

[10] Li Fei-Fei et al., ImageNet: A large-scale hierarchical image database, 2009, CVPR.

[11] Jürgen Schmidhuber, Deep learning in neural networks: An overview, 2014, Neural Networks.

[12] Wan-De Weng et al., A channel equalizer using reduced decision feedback Chebyshev functional link artificial neural networks, 2007, Inf. Sci.

[13] Dianhui Wang et al., A probabilistic learning algorithm for robust modeling using neural networks with random weights, 2015, Inf. Sci.

[14] Francisco Herrera et al., A study on the use of statistical tests for experimentation with neural networks: Analysis of parametric test conditions and non-parametric tests, 2007, Expert Syst. Appl.

[15] Pradipta Kishore Dash et al., NARX model based nonlinear dynamic system identification using low complexity neural networks and robust H∞ filter, 2013, Appl. Soft Comput.

[16] Geoffrey E. Hinton et al., Deep Learning, 2015, Nature.

[17] Najdan Vukovic et al., A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation, 2013, Neural Networks.

[18] Robert P. W. Duin et al., Feedforward neural networks with random weights, 1992, Proceedings, 11th IAPR International Conference on Pattern Recognition, Vol. II, Conference B: Pattern Recognition Methodology and Systems.

[19] Sung-Bae Cho et al., Evolutionarily optimized features in functional link neural network for classification, 2010, Expert Syst. Appl.

[20] Dejan J. Sobajic et al., Learning and generalization characteristics of the random vector Functional-link net, 1994, Neurocomputing.

[21] Danilo Comminiello et al., Functional link expansions for nonlinear modeling of audio and speech signals, 2015, 2015 International Joint Conference on Neural Networks (IJCNN).

[22] Goutam Chakraborty et al., Nonlinear channel equalization for wireless communication systems using Legendre neural networks, 2009, Signal Process.

[23] Kazuyuki Murase et al., Orthogonal least squares based complex-valued functional link network, 2012, Neural Networks.

[24] Dianhui Wang et al., Distributed learning for Random Vector Functional-Link networks, 2015, Inf. Sci.

[25] M. Friedman, A Comparison of Alternative Tests of Significance for the Problem of m Rankings, 1940.

[26] Hongxing Li et al., Fuzzy Neural Intelligent Systems, 2000.

[27] Indra Narayan Kar et al., On-line system identification of complex systems using Chebyshev neural networks, 2007, Appl. Soft Comput.

[28] Sung-Bae Cho et al., An improved swarm optimized functional link artificial neural network (ISO-FLANN) for classification, 2012, J. Syst. Softw.

[29] Najdan Vukovic et al., Robust sequential learning of feedforward neural networks in the presence of heavy-tailed noise, 2015, Neural Networks.

[30] P. N. Suganthan et al., A comprehensive evaluation of random vector functional link networks, 2016, Inf. Sci.

[31] Dianhui Wang et al., Fast decorrelated neural network ensembles with random weights, 2014, Inf. Sci.

[32] Francisco Herrera et al., Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: Experimental analysis of power, 2010, Inf. Sci.

[33] Zoran Miljković et al., Neural extended Kalman filter for monocular SLAM in indoor environment, 2016.

[34] Janez Demsar, Statistical Comparisons of Classifiers over Multiple Data Sets, 2006, J. Mach. Learn. Res.

[35] Sung-Bae Cho et al., A comprehensive survey on functional link neural networks and an adaptive PSO–BP learning for CFLNN, 2010, Neural Computing and Applications.

[36] Trevor Hastie et al., An Introduction to Statistical Learning, 2013, Springer Texts in Statistics.

[37] M. Friedman, The Use of Ranks to Avoid the Assumption of Normality Implicit in the Analysis of Variance, 1937.

[38] Debi Prasad Das et al., Functional link artificial neural network applied to active noise control of a mixture of tonal and chaotic noise, 2014, Appl. Soft Comput.

[39] Le Zhang et al., Visual Tracking With Convolutional Random Vector Functional Link Network, 2017, IEEE Transactions on Cybernetics.

[40] David J. Sheskin, Handbook of Parametric and Nonparametric Statistical Procedures, 1997.

[41] Leonardo Ramos Rodrigues et al., Building selective ensembles of Randomization Based Neural Networks with the successive projections algorithm, 2017, Appl. Soft Comput.

[42] Le Zhang et al., A survey of randomized algorithms for training neural networks, 2016, Inf. Sci.

[43] Chee Kheong Siew et al., Extreme learning machine: Theory and applications, 2006, Neurocomputing.