Kernel Methods for Pattern Analysis

The lectures will introduce the kernel methods approach to pattern analysis [1] through the particular example of support vector machines for classification. The presentation touches on generalization, optimization, dual representation, kernel design, and algorithmic implementations. We then broaden the discussion to consider general kernel methods by introducing different kernels, different learning tasks, and subspace methods such as kernel PCA. The emphasis is on the flexibility of the approach in applying the analyses to different data, with the caveat that the design of the kernel must rely on domain knowledge. Nonetheless, we will argue that, setting aside the technical requirement of positive semi-definiteness, kernel design is not an unnatural task for a practitioner. The overall aim is to give a view of the subject that enables newcomers to the field to get their bearings, so that they can move on to apply or develop the techniques for their particular application.
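
As a small illustration of the flexibility described above, the sketch below (assuming Python with NumPy and scikit-learn; the synthetic data and the choice of a Gaussian kernel are ours, purely for illustration) plugs a hand-written kernel into a support vector classifier through a precomputed Gram matrix. It shows the dual representation at work: the learner only ever sees inner products between examples, so swapping in a different data type just means supplying a different positive semi-definite kernel.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic two-class data: points inside vs. outside a circle,
# a pattern that is not linearly separable in the input space.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (np.sum(X**2, axis=1) < 0.5).astype(int)


def rbf_kernel(A, B, gamma=2.0):
    """Gaussian (RBF) kernel k(a, b) = exp(-gamma * ||a - b||^2).

    This kernel is positive semi-definite, so the Gram matrix it
    produces is a valid kernel matrix for an SVM.
    """
    sq_dists = (
        np.sum(A**2, axis=1)[:, None]
        + np.sum(B**2, axis=1)[None, :]
        - 2.0 * A @ B.T
    )
    return np.exp(-gamma * sq_dists)


# The dual formulation of the SVM needs only inner products between
# training examples, i.e. the Gram matrix K[i, j] = k(x_i, x_j).
K_train = rbf_kernel(X, X)
clf = SVC(kernel="precomputed", C=1.0).fit(K_train, y)

# Prediction likewise needs only kernel values between new points and
# the training set -- the same dual representation again.
X_new = rng.uniform(-1.0, 1.0, size=(5, 2))
print(clf.predict(rbf_kernel(X_new, X)))
```

The same pattern carries over to the other algorithms mentioned (e.g. kernel PCA): the data enter only through the kernel matrix, so adapting the method to new data types is chiefly a matter of kernel design.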