An efficient algorithm for Kernel two-dimensional principal component analysis

Recently, a new approach called two-dimensional principal component analysis (2DPCA) has been proposed for face representation and recognition. The essence of 2DPCA is that it computes the eigenvectors of the so-called image covariance matrix without requiring matrix-to-vector conversion. Kernel principal component analysis (KPCA) is a non-linear generalization of the popular principal component analysis (PCA) via the kernel trick. Similarly, kernelizing 2DPCA is beneficial for capturing the non-linear structures in the input data. However, the standard kernel 2DPCA (K2DPCA) suffers from a heavy computational burden because it operates on image matrices directly. In this paper, we propose an efficient algorithm to speed up the training procedure of K2DPCA. Experimental results on face recognition show that the proposed algorithm achieves much higher computational efficiency and remarkably reduces memory consumption compared with the standard K2DPCA.
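Since the abstract gives no implementation details of the proposed speed-up, the following is only a minimal sketch of the linear 2DPCA step it builds on, assuming a stack of equally sized grayscale images stored as a NumPy array; the function name and parameters are illustrative, not the paper's implementation.

```python
import numpy as np

def two_d_pca(images, num_components):
    """Linear 2DPCA sketch: eigendecompose the image covariance matrix
    G = (1/M) * sum_i (A_i - Abar)^T (A_i - Abar), where each A_i is an
    m x n image kept as a matrix (no matrix-to-vector conversion)."""
    mean_image = images.mean(axis=0)                  # m x n mean image
    centered = images - mean_image                    # M x m x n
    # Image covariance matrix: only n x n (n = image width), far smaller
    # than the (m*n) x (m*n) covariance classical PCA would require.
    G = np.einsum('imj,imk->jk', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)              # ascending eigenvalues
    X = eigvecs[:, ::-1][:, :num_components]          # top-d projection axes
    features = images @ X                             # M x m x d feature matrices
    return X, features

# Illustrative usage on random data standing in for face images.
images = np.random.rand(100, 112, 92)                 # 100 images, 112 x 92 pixels
X, features = two_d_pca(images, num_components=10)
print(X.shape, features.shape)                        # (92, 10) (100, 112, 10)
```

For 112x92 images, G is only 92x92, whereas classical PCA on vectorized images would eigendecompose a 10304x10304 covariance matrix; this is the size advantage the abstract attributes to 2DPCA. The kernel variant replaces these inner products with kernel evaluations, which is where the computational problem addressed by the paper arises.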
