Tree-Based Orthogonal Least Squares Regression with Repeated Weighted Boosting Search

Orthogonal Least Squares Regression (OLSR) selects each regressor by repeated weighted boosting search (RWBS). This form of OLSR is known to produce a much sparser model than many other kernel methods. With the aid of a tree-structured search, this paper constructs an even sparser regression model in the framework of OLSR with RWBS. When RWBS is used to solve the optimization at each regression stage, OLSR is extended by keeping the k (k > 1) best regressors that minimize the modeling mean square error (MSE), rather than choosing only the single best one at each iteration. In this way, the next regressor is searched for in k subspaces instead of in only one subspace, as in the conventional method. Furthermore, we propose a subtree search that reduces the empirical time complexity by capping the total number of children kept at every tree depth. The new schemes are shown to outperform the traditional method in applications such as component detection, sparse representation of ECG signals, and 2-D time series modeling. Experimental results also indicate that the subtree-based algorithm has much lower time complexity than the tree-based one.
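The tree search described above amounts to a beam search over forward regressor selection: instead of keeping the single regressor that most reduces the residual MSE at each stage, the algorithm keeps the k best partial models and expands each of them, with the subtree variant capping the total number of children retained per depth. The sketch below is a minimal, hypothetical illustration of that idea over a fixed regressor dictionary `Phi`; it uses plain least squares per candidate set rather than the paper's orthogonalization and RWBS kernel tuning, and the function and parameter names (`beam_ols`, `beam_width`) are our own.

```python
import numpy as np

def beam_ols(Phi, y, n_terms, beam_width=3):
    """Forward regressor selection with a bounded beam (subtree-style search).

    Greedy OLS-type selection corresponds to beam_width=1; beam_width=k keeps
    the k partial models with lowest residual MSE at every depth, so the next
    regressor is searched in k subspaces. A simplified sketch, not the paper's
    full OLSR-with-RWBS procedure.
    """
    n_atoms = Phi.shape[1]
    beam = [((), np.inf)]  # list of (selected column indices, residual MSE)
    for _ in range(n_terms):
        children = {}
        for sel, _ in beam:
            for j in range(n_atoms):
                if j in sel:
                    continue
                idx = sel + (j,)
                key = frozenset(idx)          # merge order-equivalent models
                A = Phi[:, idx]
                w, *_ = np.linalg.lstsq(A, y, rcond=None)
                mse = float(np.mean((y - A @ w) ** 2))
                if key not in children or mse < children[key][1]:
                    children[key] = (idx, mse)
        # Subtree cap: keep only beam_width children in total at this depth.
        beam = sorted(children.values(), key=lambda t: t[1])[:beam_width]
    return beam[0]  # best (indices, mse) found
```

Because the width-k beam always contains the greedy child at every depth, its final MSE is never worse than the greedy (width-1) result, which mirrors the sparsity/accuracy advantage the paper reports for the tree-based scheme.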
