Kernelized LARS–LASSO for constructing radial basis function neural networks

Model structure selection is of crucial importance in radial basis function (RBF) neural networks. Existing model structure selection algorithms are essentially forward selection or backward elimination methods that may lead to sub-optimal models. This paper proposes an alternative selection procedure based on the kernelized least angle regression (LARS)–least absolute shrinkage and selection operator (LASSO) method. By formulating the RBF neural network as a linear-in-the-parameters model, we derive an l1-constrained objective function for training the network. The proposed algorithm makes it possible to dynamically drop a previously selected regressor term that has become insignificant. Furthermore, inspired by the idea of LARS, the computation of the output weights in our algorithm is greatly simplified. Because the proposed algorithm conducts model structure selection and parameter optimization simultaneously, it builds a network with better generalization performance. Computational experiments with artificial and real-world data confirm the efficacy of the proposed algorithm.
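
The following is a minimal sketch, not the authors' implementation, of the general idea described above: treating every training point as a candidate RBF centre gives a linear-in-the-parameters design matrix, and an l1-penalised fit computed along the LARS path keeps only a sparse subset of centres. It assumes scikit-learn's LassoLars and rbf_kernel as stand-ins for the paper's kernelized LARS–LASSO step; the kernel width gamma and penalty alpha are illustrative choices, not values from the paper.

```python
# Hypothetical sketch: sparse RBF centre selection via a LASSO fit computed by
# least angle regression, using scikit-learn as a stand-in for the paper's
# kernelized LARS-LASSO procedure.
import numpy as np
from sklearn.linear_model import LassoLars
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))              # training inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)   # noisy target

# Linear-in-the-parameters model: every training point is a candidate centre,
# so the design matrix is the Gaussian (RBF) kernel evaluated at all pairs.
Phi = rbf_kernel(X, X, gamma=1.0)                       # gamma: assumed width

# LARS-LASSO path: the l1 penalty drives most output weights to exactly zero,
# and LARS can also drop a previously selected regressor that has become
# insignificant, which is the behaviour highlighted in the abstract.
model = LassoLars(alpha=1e-3, fit_intercept=False)      # alpha: illustrative
model.fit(Phi, y)

selected = np.flatnonzero(model.coef_)                  # retained centre indices
print(f"selected {selected.size} of {Phi.shape[1]} candidate centres")

# Prediction with the resulting sparse RBF network on new inputs.
X_new = np.linspace(-3.0, 3.0, 50).reshape(-1, 1)
y_hat = rbf_kernel(X_new, X, gamma=1.0) @ model.coef_
```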
