An Improved Domain Multiple Kernel Support Vector Machine

In a support vector machine (SVM), the choice of kernel function is critical: different kernels yield different classification accuracies. Researchers have therefore pursued ways to combine multiple kernels harmoniously to improve SVM performance, giving rise to multiple kernel learning (MKL). Recently, an efficient generalized multiple kernel learning (GMKL) method was presented, which combines the advantages of L1-norm and L2-norm regularization. However, the GMKL algorithm does not fully exploit the common information shared among the selected kernels. On the other hand, the MultiK-MHKS algorithm uses canonical correlation analysis (CCA) to extract the common information among the kernels, but ignores kernel selection. This paper therefore combines the two approaches and presents an improved domain multiple kernel support vector machine (IDMK-SVM). Simulation experiments demonstrate that IDMK-SVM achieves higher classification accuracy than existing typical MKL algorithms.
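To make the kernel-combination idea concrete, the following is a minimal sketch of an SVM trained on a fixed convex combination of two base kernels; MKL methods such as GMKL instead learn the combination weights under L1/L2 regularization. The toy data, kernel choices, and weights here are illustrative assumptions, not part of the paper's method.

```python
import numpy as np
from sklearn.svm import SVC

# Toy two-class data: two Gaussian blobs (illustrative only, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

def linear_kernel(A, B):
    # K(a, b) = a . b
    return A @ B.T

def rbf_kernel(A, B, gamma=1.0):
    # K(a, b) = exp(-gamma * ||a - b||^2)
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

# Fixed convex combination of base kernels; an MKL algorithm would
# optimize these weights jointly with the SVM instead of fixing them.
weights = [0.5, 0.5]
K_train = weights[0] * linear_kernel(X, X) + weights[1] * rbf_kernel(X, X)

# Train an SVM directly on the combined Gram matrix.
clf = SVC(kernel="precomputed").fit(K_train, y)
train_acc = (clf.predict(K_train) == y).mean()
```

A nonnegative combination of positive semidefinite kernels is itself a valid kernel, which is why the combined Gram matrix can be passed to the SVM unchanged.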