KPCA denoising and the pre-image problem revisited

Kernel principal component analysis (KPCA) is widely used in classification, feature extraction and denoising. In denoising applications one inevitably has to deal with the pre-image problem, which constitutes the most involved step of the whole processing chain. One way to tackle this problem is an iterative solution based on a fixed-point algorithm. An alternative strategy uses an algebraic approach that relies on the solution of an under-determined system of equations. In this work we present a method that uses the algebraic approach to estimate a good starting point for the fixed-point iteration. We demonstrate that this hybrid pre-image solution performs better than either of the other two methods alone. Furthermore, we extend the applicability of KPCA to one-dimensional signals, which occur in many signal processing applications, and show that artefact removal from such data can be treated on the same footing as denoising. Finally, we apply the algorithm to denoise the well-known USPS data set and to extract EOG interferences from single-channel EEG recordings.
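
To make the processing chain concrete, the following is a minimal sketch of Gaussian-kernel KPCA denoising with the fixed-point pre-image iteration, written in Python/NumPy. The function and variable names (kpca_fit, preimage_fixed_point, sigma, z0) are illustrative choices, not taken from the paper; the algebraic estimation of the starting point described above is not implemented here, but any such estimate can be supplied as z0 in place of the default initialisation with the noisy sample itself.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    """Pairwise Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def kpca_fit(X, sigma, n_components):
    """Fit Gaussian-kernel KPCA on training data X (n_samples x n_features)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n   # double centring in feature space
    eigval, eigvec = np.linalg.eigh(Kc)
    idx = np.argsort(eigval)[::-1][:n_components]        # leading principal components
    # scale eigenvectors so the feature-space axes have unit norm
    alphas = eigvec[:, idx] / np.sqrt(np.maximum(eigval[idx], 1e-12))
    return K, alphas

def preimage_fixed_point(x, X, sigma, K, alphas, z0=None, n_iter=200, tol=1e-9):
    """Fixed-point pre-image iteration for the Gaussian kernel.

    The projected feature-space image of x is expanded over the training
    maps phi(x_i) with coefficients gamma_i; each iteration re-weights the
    training samples by the kernel evaluated at the current iterate.
    """
    n = X.shape[0]
    kx = gaussian_kernel(x[None, :], X, sigma).ravel()    # k(x, x_i)
    kx_c = kx - kx.mean() - K.mean(axis=0) + K.mean()     # centre the test kernel row
    betas = alphas.T @ kx_c                               # projections onto retained components
    gammas = alphas @ betas                               # expansion coefficients (centred part)
    gammas += (1.0 - gammas.sum()) / n                    # restore the subtracted feature-space mean
    # z0: optional starting point, e.g. an algebraic pre-image estimate (not computed here)
    z = x.copy() if z0 is None else np.asarray(z0, dtype=float).copy()
    for _ in range(n_iter):
        w = gammas * gaussian_kernel(z[None, :], X, sigma).ravel()
        denom = w.sum()
        if abs(denom) < 1e-12:                            # degenerate step: keep the last iterate
            break
        z_new = (w @ X) / denom
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z
```

A typical call fits the model on clean training patterns, K, alphas = kpca_fit(X, sigma, q), and then denoises a noisy sample via preimage_fixed_point(x_noisy, X, sigma, K, alphas, z0=z_algebraic); supplying an algebraic pre-image estimate as z0 mimics the hybrid initialisation discussed above, whereas omitting it falls back to the plain fixed-point scheme.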
