Vector
A support vector machine (SVM) is an algorithm that finds a hyperplane which optimally separates labeled data points in R^n into positive and negative classes. The data points on the margin of this separating hyperplane are called support vectors. We connect the possible configurations of support vectors to Radon's theorem, which provides guarantees for when a set of points can be divided into two classes (positive and negative) whose convex hulls intersect. If the convex hulls of the positive and negative support vectors are projected onto a separating hyperplane, then the projections intersect if and only if the hyperplane is optimal. Further, with a particular type of general position, we show that (a) the projected convex hulls of the support vectors intersect in exactly one point, (b) the support vectors are stable under perturbation, (c) there are at most n + 1 support vectors, and (d) every number of support vectors from 2 up to n + 1 is possible. Finally, we perform computer simulations studying the expected number of support vectors, and their configurations, for randomly generated data. We observe that as the distance between classes of points increases for this type of randomly generated data, configurations with fewer support vectors become more likely.
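The following is a minimal sketch (not the authors' code) of the setup described in the abstract: fitting a linear, hard-margin-style SVM on two randomly generated classes in R^d, extracting the support vectors, and projecting them onto the separating hyperplane. The use of scikit-learn's SVC, the large value of C as a stand-in for a hard margin, and the specific data parameters are all assumptions made for illustration.

    # Sketch only: assumes scikit-learn; parameters (n, d, separation, C) are illustrative.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n, d = 50, 3           # points per class and ambient dimension (assumed values)
    separation = 2.0       # distance between class centers (assumed value)

    # Two linearly separable point clouds in R^d with labels +1 and -1.
    X_pos = rng.normal(loc=separation, size=(n, d))
    X_neg = rng.normal(loc=-separation, size=(n, d))
    X = np.vstack([X_pos, X_neg])
    y = np.hstack([np.ones(n), -np.ones(n)])

    # A hard-margin SVM is approximated here by a soft-margin SVM with a very large C.
    clf = SVC(kernel="linear", C=1e6).fit(X, y)

    w, b = clf.coef_[0], clf.intercept_[0]   # normal vector and offset of the hyperplane
    sv = clf.support_vectors_                # the support vectors (points on the margin)

    # Orthogonal projection of the support vectors onto the hyperplane w.x + b = 0.
    proj = sv - ((sv @ w + b) / (w @ w))[:, None] * w

    # Under the general position assumption in the abstract, the number of support
    # vectors is between 2 and d + 1.
    print("number of support vectors:", len(sv))

Repeating such a fit over many random draws, while varying the separation between the class centers, is one way to reproduce the kind of simulation the abstract describes; the abstract's claim is that larger separations make configurations with fewer support vectors more likely, and that the convex hulls of the projected positive and negative support vectors intersect exactly when the hyperplane is optimal.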