Sparse incremental regression modeling using correlation criterion with boosting search

A novel technique is presented for constructing sparse generalized Gaussian kernel regression models. The proposed method builds the model incrementally, appending one regressor at a time by tuning the mean vector and diagonal covariance matrix of an individual Gaussian regressor to best fit the training data according to a correlation criterion. It is shown that maximizing this criterion is identical to incrementally minimizing the modeling mean square error (MSE). The optimization at each regression stage is carried out with a simple search algorithm reinforced by boosting. Experimental results demonstrate that the technique offers a viable alternative to existing state-of-the-art kernel modeling methods for constructing parsimonious models.
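
The abstract describes the construction only at a high level. As a rough illustrative sketch (not the authors' implementation; all function names here are hypothetical and NumPy is assumed), the skeleton below appends one Gaussian regressor per stage, scoring each candidate mean vector and diagonal covariance by the squared correlation of the regressor output with the current residual; this score equals the one-step reduction in MSE, which is the equivalence the abstract states. A plain random search stands in for the paper's boosting-reinforced search.

```python
import numpy as np

def gaussian_regressor(X, mu, sigma):
    # Generalized Gaussian kernel with a per-dimension width vector
    # (i.e. a diagonal covariance matrix).
    return np.exp(-0.5 * np.sum(((X - mu) / sigma) ** 2, axis=1))

def fit_incremental(X, y, n_regressors=10, n_candidates=200, seed=None):
    # Greedy stagewise construction: each stage keeps the candidate whose
    # output is most correlated with the current residual. Because the
    # residual norm is fixed within a stage, maximizing the squared
    # correlation (g.r)^2 / (g.g) also maximizes the one-step MSE reduction.
    rng = np.random.default_rng(seed)
    residual = np.asarray(y, dtype=float).copy()
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.maximum(hi - lo, 1e-12)
    model = []                                    # (mu, sigma, weight) triples
    for _ in range(n_regressors):
        best = None
        for _ in range(n_candidates):             # random-search stand-in for
            mu = lo + span * rng.random(X.shape[1])   # the boosting search
            sigma = span * (0.05 + 0.95 * rng.random(X.shape[1]))
            g = gaussian_regressor(X, mu, sigma)
            gg = g @ g
            if gg < 1e-12:
                continue
            score = (g @ residual) ** 2 / gg      # one-step MSE reduction
            if best is None or score > best[0]:
                best = (score, mu, sigma, g, gg)
        _, mu, sigma, g, gg = best
        w = (g @ residual) / gg                   # least-squares weight
        residual -= w * g
        model.append((mu, sigma, w))
    return model, residual

def predict(model, X):
    return sum(w * gaussian_regressor(X, mu, sigma) for mu, sigma, w in model)

# Usage sketch: fit a sparse model to a noisy 1-D sinc target.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
model, resid = fit_incremental(X, y, n_regressors=8, seed=0)
print(len(model), float(resid @ resid) / len(y))  # model size, final MSE
```

One caveat on the design choice: the paper's boosting-reinforced search maintains and reweights a population of candidate solutions rather than sampling uniformly, so the random search above should be read only as a placeholder for the per-stage optimizer.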
