Selective support vector machines

Abstract. In this study we introduce a generalized support vector classification problem: let Xi, i=1,…,n be mutually exclusive sets of pattern vectors such that all pattern vectors xi,k, k=1,…,|Xi| share the same class label yi. Select exactly one pattern vector $x_{i,k^{*}}$ from each set Xi such that the margin between the selected positive and negative pattern vectors is maximized. This problem is formulated as a quadratic mixed 0-1 programming problem, a generalization of the standard support vector classifier, and the formulation is shown to be $\mathcal{NP}$-hard. As an alternative, we propose an approach based on the free slack concept, with primal and dual formulations for both linear and nonlinear classification. These formulations give the separating hyperplane the flexibility to identify pattern vectors with large margin. Iterative elimination and direct selection methods are developed to select such pattern vectors using the alternative formulations, and are compared with a naïve method on simulated data. The iterative elimination method is also applied to neural data from a visuomotor categorical discrimination task to classify highly cognitive brain activities.
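The iterative elimination idea from the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it trains an ordinary soft-margin linear SVM by Pegasos-style subgradient descent (a stand-in for the paper's free-slack formulations), then repeatedly discards, from each candidate set Xi with more than one remaining vector, the pattern vector with the smallest signed margin. The function names and the fixed round count are assumptions for the sketch.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Soft-margin linear SVM via Pegasos-style subgradient descent.

    X: (n, d) array of pattern vectors; y: array of labels in {-1, +1}.
    Returns weight vector w and bias b.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    t = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)  # decaying step size
            if y[i] * (X[i] @ w + b) < 1:  # margin violation: hinge subgradient
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1 - eta * lam) * w  # regularization shrinkage only
    return w, b

def iterative_elimination(groups, labels, rounds=5):
    """Per round: fit an SVM on all remaining candidates, then drop the
    worst-margin candidate from every group that still has more than one.

    groups: list of (|Xi|, d) arrays, each holding the candidate pattern
    vectors of one set Xi; labels: list of class labels in {-1, +1}, one
    per group. Returns the indices of the candidates kept in each group.
    """
    selected = [list(range(len(g))) for g in groups]
    for _ in range(rounds):
        X = np.vstack([g[idxs] for g, idxs in zip(groups, selected)])
        y = np.concatenate([[lab] * len(idxs)
                            for lab, idxs in zip(labels, selected)])
        w, b = train_linear_svm(X, y)
        for gi, (g, idxs) in enumerate(zip(groups, selected)):
            if len(idxs) > 1:
                margins = [labels[gi] * (g[i] @ w + b) for i in idxs]
                worst = idxs[int(np.argmin(margins))]
                selected[gi] = [i for i in idxs if i != worst]
    return selected
```

With `rounds` at least |Xi| − 1 for every set, each group is reduced to a single selected pattern vector, mirroring the "select one vector per set" requirement of the selective SVM problem.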
