Quaternionic and Complex-Valued Support Vector Regression for Equalization and Function Approximation

Support vector regressors (SVRs) are a class of nonlinear regressors inspired by Vapnik's support vector (SV) method for pattern classification. The standard SVR has been successfully applied to real-valued regression problems such as financial prediction and weather forecasting. However, in some applications the domain of the function to be estimated may be more naturally and efficiently expressed using complex numbers (e.g. communications channels) or quaternions (e.g. 3-dimensional geometrical problems). Since SVRs have previously been shown to be efficient and accurate regressors, the extension of this method to complex numbers and quaternions is of great interest. In the present paper the standard SVR method is extended to cover regression over complex numbers and quaternions. Our method differs from existing approaches insofar as the cost function applied in the output space is rotationally invariant, which is important because in most cases it is the magnitude of the output error that matters, not its angle. We demonstrate the practical usefulness of this new formulation by considering the problem of communications channel equalization.
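To illustrate the rotational-invariance property described above, the sketch below applies an epsilon-insensitive penalty to the magnitude of a complex-valued residual, so that the loss depends only on how far the prediction is from the target, not on the phase of the error. This is a minimal illustration of the general idea, not the paper's actual formulation; the function name and the `eps` parameter are hypothetical.

```python
import numpy as np

def eps_insensitive_magnitude_loss(y_true, y_pred, eps=0.1):
    """Rotationally invariant epsilon-insensitive loss (illustrative sketch).

    Penalizes only the magnitude |y_true - y_pred| of the residual,
    so rotating the error in the complex plane leaves the loss unchanged.
    Errors with magnitude below eps incur zero loss, as in standard SVR.
    """
    r = np.abs(y_true - y_pred)          # |error| is invariant under rotation
    return np.maximum(0.0, r - eps)      # epsilon-insensitive tube

# The same idea extends to quaternions by taking the quaternion norm
# (the Euclidean norm of the 4-component representation) of the residual.
```

For example, the residuals `0.5j` and `0.5` (the same error rotated by 90 degrees) receive identical loss, whereas a componentwise real-valued loss would generally treat them differently.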
