A Fast Algorithm for Updating and Downsizing the Dominant Kernel Principal Components

Many important kernel methods in machine learning, such as kernel principal component analysis, feature approximation, denoising, compression, and prediction, require the dominant eigenvectors of the symmetric kernel Gram matrix. An efficient incremental approach for the fast calculation of the dominant kernel eigenbasis was recently presented. In this paper we propose faster algorithms for incrementally updating and downsizing the dominant kernel eigenbasis. These methods are well suited to large-scale problems, since they are efficient in terms of both computational complexity and data management.
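To make the setting concrete, the following sketch (in NumPy, not the authors' implementation) shows the two ingredients the abstract refers to: extracting the dominant eigenbasis of a kernel Gram matrix, and updating that basis when a new data point arrives without rebuilding the full matrix. The update works in the small subspace spanned by the current basis, the residual of the new kernel column, and the new coordinate direction; the kernel choice (RBF) and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix; an arbitrary choice for illustration
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def dominant_eigenbasis(K, k):
    # eigh returns eigenvalues in ascending order; keep the k largest
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:k]
    return w[idx], V[:, idx]

def update_eigenbasis(w, U, a, d):
    """Rank-k eigenbasis update when K grows by one row/column.

    K_new = [[K, a], [a^T, d]], with K approximated by U diag(w) U^T.
    Only a small (k+2) x (k+2) eigenproblem is solved, instead of
    re-decomposing the full (n+1) x (n+1) matrix.
    """
    n, k = U.shape
    p = U.T @ a                # new column expressed in the current basis
    r = a - U @ p              # residual orthogonal to the basis
    rho = np.linalg.norm(r)
    q = r / rho if rho > 1e-12 else np.zeros(n)

    # Representation of K_new in the basis {[U;0], [q;0], e_{n+1}}
    S = np.zeros((k + 2, k + 2))
    S[:k, :k] = np.diag(w)
    S[:k, k + 1] = p
    S[k + 1, :k] = p
    S[k, k + 1] = rho
    S[k + 1, k] = rho
    S[k + 1, k + 1] = d

    ws, Vs = np.linalg.eigh(S)
    idx = np.argsort(ws)[::-1][:k]

    # Lift the small eigenvectors back to the enlarged space
    B = np.zeros((n + 1, k + 2))
    B[:n, :k] = U
    B[:n, k] = q
    B[n, k + 1] = 1.0
    return ws[idx], B @ Vs[:, idx]
```

When the rank k equals the matrix size, the low-rank approximation is exact and the updated eigenvalues coincide with those of the enlarged Gram matrix; in the truncated case the update tracks the dominant subspace approximately, which is the regime the paper's algorithms address.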
