Multi-output LS-SVR machine in extended feature space

Support vector regression (SVR) machines are typically used to predict a single output. Multi-output regression problems have previously been handled by building multiple independent single-output regression models, an approach that ignores the correlations among the outputs. To exploit these correlations, a new method that constructs a multi-output model directly is presented. By extending the original feature space through vector virtualization, the multi-output case is expressed as a formally equivalent single-output problem in the extended feature space, which can then be solved with a least squares support vector regression (LS-SVR) machine. Experimental results show that the method performs well.
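The abstract does not spell out the virtualization scheme, so the following is only a minimal sketch of the general idea under one common assumption: each sample with m outputs is replicated m times, a one-hot output-index tag is appended to the input vector, and a single LS-SVR is trained on the stacked data, so that the shared kernel couples the outputs. All names (`LSSVR`, `virtualize`) and parameter choices are illustrative, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

class LSSVR:
    """Least squares SVR: the QP of standard SVR is replaced by a linear
    KKT system  [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]."""
    def __init__(self, C=10.0, gamma=1.0):
        self.C, self.gamma = C, gamma

    def fit(self, X, y):
        n = len(X)
        K = rbf_kernel(X, X, self.gamma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.C
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self

    def predict(self, X):
        return rbf_kernel(X, self.X, self.gamma) @ self.alpha + self.b

def virtualize(X, m):
    """Assumed 'vector virtualization': repeat each input m times and
    append a one-hot tag e_j marking which of the m outputs it targets."""
    n = len(X)
    Xe = np.repeat(X, m, axis=0)            # each row repeated m times
    tags = np.tile(np.eye(m), (n, 1))        # e_1 ... e_m, per sample
    return np.hstack([Xe, tags])

# Demo: two correlated outputs reduced to one single-output LS-SVR fit.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
Y = np.stack([X[:, 0] + X[:, 1], X[:, 0] - X[:, 1]], axis=1)  # m = 2 outputs

m = Y.shape[1]
model = LSSVR(C=100.0, gamma=0.5).fit(virtualize(X, m), Y.reshape(-1))
pred = model.predict(virtualize(X, m)).reshape(-1, m)
```

Note that with an RBF kernel on the extended vectors, the kernel factorizes as k(x, x') * exp(-gamma * ||e_j - e_k||^2), so training points for different outputs still influence each other, which is one way the correlations between outputs can enter the single-output formulation.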
