Efficient Parameter Selection of Support Vector Machines

The Support Vector Machine (SVM) has, over the years, established itself as an effective machine learning method. One of its strengths is the use of a kernel function, which allows it to handle arbitrarily structured, non-linear data sets. To fully realize the benefits of the kernel function, however, the SVM parameters must be fine-tuned to achieve good results, and parameter selection becomes increasingly difficult as the number of parameters and the size of the dataset grow. In this paper, we propose a method for effective SVM parameter selection for optimal performance, evaluated through experiments on heart sound data using features obtained with the IEFE extraction technique.
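For context, the sketch below shows the conventional baseline that such work aims to improve on: cross-validated grid search over the SVM penalty C and kernel width gamma. The paper's actual selection method, the heart sound recordings, and the IEFE features are not reproduced here; a synthetic dataset and an RBF kernel are assumptions made purely for illustration, using scikit-learn.

```python
# A minimal sketch of conventional SVM parameter selection via cross-validated
# grid search (scikit-learn). Synthetic data stands in for the heart sound /
# IEFE features used in the paper; the RBF kernel and grid are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Placeholder for the extracted feature vectors and class labels.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scaling is folded into the pipeline so it is refit inside each CV split.
pipe = Pipeline([("scale", StandardScaler()), ("svm", SVC(kernel="rbf"))])

# Logarithmic grid over C and gamma; this exhaustive search is exactly what
# becomes expensive as the number of parameters and the dataset size grow.
param_grid = {
    "svm__C": [0.1, 1, 10, 100],
    "svm__gamma": [1e-3, 1e-2, 1e-1, 1],
}
search = GridSearchCV(pipe, param_grid, cv=5, n_jobs=-1)
search.fit(X_train, y_train)

print("best parameters:", search.best_params_)
print("held-out accuracy:", search.score(X_test, y_test))
```

The cost of this baseline scales with the product of the grid sizes times the number of folds, which motivates more efficient selection strategies such as the one proposed in the paper.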
