Kernel-Based Nonlinear Blind Source Separation

We propose kTDSEP, a kernel-based algorithm for nonlinear blind source separation (BSS). It combines two complementary research fields: kernel feature spaces and BSS using temporal information. This yields an efficient algorithm for nonlinear BSS with an invertible nonlinearity. The key assumptions are that the kernel feature space is chosen rich enough to approximate the nonlinearity and that the signals of interest contain temporal structure. Both assumptions are fulfilled for a wide range of real-world applications. The algorithm works as follows: First, the data are (implicitly) mapped to a high-dimensional (possibly infinite-dimensional) kernel feature space. In practice, however, the data form a smaller submanifold in feature space, of dimension even smaller than the number of training data points, a fact that has already been exploited by, for example, reduced set techniques for support vector machines. We propose to adapt to this effective dimension as a preprocessing step and to construct an orthonormal basis of this submanifold. This dimension-reduction step is essential for making the subsequent application of BSS methods computationally and numerically tractable. In the reduced space, we use a BSS algorithm based on second-order temporal decorrelation. Finally, we propose a selection procedure that automatically recovers the original sources from the extracted nonlinear components. Experiments demonstrate the excellent performance and efficiency of our kTDSEP algorithm on several nonlinear BSS problems, including settings with more than two sources.
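The pipeline described above can be sketched in a few lines of numpy. This is a toy illustration under stated simplifications, not the authors' implementation: centered kernel PCA stands in for the orthonormal-basis construction on the feature-space submanifold, and a single-lag second-order decorrelation (an AMUSE-style eigendecomposition) stands in for TDSEP's joint diagonalization over multiple time lags. All function names, the RBF bandwidth, and the lag choice are illustrative.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of an RBF kernel between the rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_pca_map(X, n_components, gamma=1.0):
    """Coordinates of the data in an orthonormal basis of the
    submanifold spanned in kernel feature space (centered kernel PCA)."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest
    vals, vecs = vals[idx], vecs[:, idx]
    # Projections of the training points onto the leading directions
    return vecs * np.sqrt(np.maximum(vals, 1e-12))

def second_order_decorrelation(Z, tau=1):
    """AMUSE-style separation: whiten, then diagonalize one
    symmetrized time-lagged covariance matrix."""
    Z = Z - Z.mean(0)
    C0 = Z.T @ Z / len(Z)
    d, E = np.linalg.eigh(C0)
    W = E / np.sqrt(np.maximum(d, 1e-12))        # whitening matrix
    Y = Z @ W                                    # whitened signals
    Ct = Y[:-tau].T @ Y[tau:] / (len(Y) - tau)
    Ct = (Ct + Ct.T) / 2                         # symmetrize the lagged covariance
    _, U = np.linalg.eigh(Ct)
    return Y @ U                                 # extracted components

# Toy invertible nonlinear mixture of two temporally structured sources:
# (a, b) -> (a + b, (a - b)^3) is invertible via a cube root.
t = np.linspace(0, 30, 1000)
s = np.c_[np.sin(2 * t), np.sign(np.sin(3.1 * t))]
x = np.c_[s[:, 0] + s[:, 1], (s[:, 0] - s[:, 1]) ** 3]

Phi = kernel_pca_map(x, n_components=4, gamma=0.5)   # feature-space basis
components = second_order_decorrelation(Phi)         # linear BSS in that basis
```

The full kTDSEP algorithm additionally diagonalizes several lagged covariances jointly and applies the selection procedure mentioned above to pick out the true sources among the extracted components; here all components are returned.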
