Parallel Support Vector Data Description

This paper proposes an extension of Support Vector Data Description (SVDD) that provides a better data description. The extension, called Distant SVDD (DSVDD), determines the smallest hypersphere enclosing all normal (positive) samples, as in SVDD, and additionally maximises the distance from the centre of that hypersphere to the origin. When abnormal (negative) samples are also available, DSVDD is further extended to Parallel SVDD, which determines a smallest hypersphere for the normal samples and, at the same time, a smallest hypersphere for the abnormal samples, while maximising the distance between the centres of the two hyperspheres. Experimental results on classification tasks show that the proposed extensions achieve higher accuracy than the original SVDD.
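The core geometric idea shared by SVDD and its extensions is fitting a smallest hypersphere around the normal samples and flagging points outside it. A minimal sketch of that idea is shown below using a simple iterative minimum-enclosing-ball approximation (the Bădoiu–Clarkson update); this is an assumption-laden stand-in for illustration only, not the quadratic-programming formulation with kernels and slack variables that SVDD and the proposed DSVDD/Parallel SVDD actually solve. The data and function names are hypothetical.

```python
import numpy as np

def enclosing_ball(points, iters=500):
    """Approximate the minimum enclosing ball of `points`
    (Badoiu-Clarkson iteration). SVDD with a linear kernel and no
    slack reduces to this geometric problem; the paper's QP-based
    formulation is not reproduced here."""
    c = points[0].astype(float)
    for i in range(1, iters + 1):
        d = np.linalg.norm(points - c, axis=1)
        f = points[np.argmax(d)]        # farthest point from current centre
        c += (f - c) / (i + 1)          # shift centre toward the farthest point
    r = np.linalg.norm(points - c, axis=1).max()
    return c, r

def is_normal(x, centre, radius):
    """Classify a point as normal iff it lies inside the hypersphere."""
    return np.linalg.norm(x - centre) <= radius

# Toy normal data clustered near (2, 2) -- hypothetical illustration.
rng = np.random.default_rng(0)
X = rng.normal(loc=2.0, scale=0.3, size=(100, 2))
centre, radius = enclosing_ball(X)
```

A point far from the cluster, e.g. `np.array([10.0, 10.0])`, falls outside the fitted ball and would be rejected as abnormal; DSVDD additionally pushes `centre` away from the origin, and Parallel SVDD fits a second ball to abnormal samples and separates the two centres.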