Feasible Adaptation Criteria for Hybrid Wavelet-Large Margin Classifiers

In the context of signal classification, this paper assembles and compares criteria that judge the discrimination quality of a set of feature vectors with little computational effort. The quality measures rest on the assumption that a Support Vector Machine performs the final classification; the ultimate criterion is therefore a large margin separating the two classes. We apply the criteria to control the feature extraction process for signal classification. Adaptive features related to the shape of the signals are extracted by wavelet filtering followed by a nonlinear map. Because many candidate features must be tested, the criteria have to be easy to compute while still reliably predicting the classification performance. We also present a novel approach for computing the radius of a set of points in feature space. The radius, in relation to the margin, forms the most commonly used error bound for Support Vector Machines. For isotropic kernels, the radius computation can be reduced to a standard Support Vector Machine classification problem.
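
For orientation, the radius-margin bound referred to above takes the following standard form; the notation m, R, and \gamma is generic and not taken from the paper, and the paper's exact statement may differ in constants. For m training samples mapped into feature space,
\[
  \mathbb{E}\bigl[\mathrm{err}\bigr] \;\le\; \frac{1}{m}\,\mathbb{E}\!\left[\frac{R^2}{\gamma^2}\right],
\]
where \gamma is the margin of the separating hyperplane and R is the radius of the smallest ball enclosing the mapped training points. In the usual formulation, R^2 is the optimal value of the quadratic program
\[
  R^2 \;=\; \max_{\beta}\; \sum_i \beta_i\, k(x_i, x_i) \;-\; \sum_{i,j} \beta_i \beta_j\, k(x_i, x_j)
  \quad \text{subject to} \quad \beta_i \ge 0, \;\; \sum_i \beta_i = 1 .
\]
For isotropic kernels, k(x_i, x_i) is constant, which is the setting in which the paper reduces the radius computation to an ordinary Support Vector Machine classification problem.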
