Separation methods for nonlinear mixtures

This subproject considered ICA and BSS problems for nonlinear data models. In that setting, ICA suffers from severe indeterminacies, which make the BSS problem ill-posed. Hence some form of regularization is necessary to obtain solutions that can be interpreted as the underlying sources.

The regularization problem has been analyzed from several perspectives and has led to powerful methods. For example, explicit constraints on the model class are used in the case of post-nonlinear mixtures, where the nonlinearity is modeled as a component-wise distortion of a linear mixture. More implicit constraints are implemented in kernel-based methods such as kTDSEP, where the choice of the kernel parameters regularizes the complexity of the nonlinearities. Exploiting prior information on the sources, for example bounded or temporally correlated sources, also reduces the indeterminacies of the solutions and leads to simplified algorithms.

Another promising approach regularizes the solution with a fully Bayesian variational ensemble learning method, which seeks the sources and the mapping that most probably generated the observed data under the chosen prior. Ensemble learning makes nonlinear source separation feasible for problems of realistic size and can easily be extended in various directions. It has also been argued that regularization based on an assumed smoothness of the mixture can overcome the ill-posedness of nonlinear BSS. Although the exact conditions under which this holds are still a matter of debate, several examples of successful source recovery with the MISEP method, which uses smoothness for regularization, have been presented. Last but not least, there has been a high degree of cooperation among the partners to compare results and to promote combined approaches.
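To make the post-nonlinear (PNL) model and the role of temporal priors concrete, the following NumPy sketch (written for this summary, not taken from any partner's implementation) generates a two-channel PNL mixture x = f(As) with component-wise distortions f_i and separates it with the mirror structure y = B g(x). For brevity the inverses g_i = f_i^{-1} are assumed known and the linear stage is a simple AMUSE-style time-lagged decorrelation; actual PNL methods estimate the component-wise nonlinearities blindly, for instance with Gaussianizing transformations, before a TDSEP-like temporal-decorrelation step.

    import numpy as np

    # Two temporally correlated sources (the temporal-correlation prior
    # mentioned above is what makes second-order separation possible).
    T = 20_000
    t = np.arange(T)
    s = np.vstack([np.sin(2 * np.pi * 0.003 * t),
                   np.cos(2 * np.pi * 0.011 * t) ** 3])

    # Post-nonlinear mixing: linear mixture followed by component-wise distortions.
    A = np.array([[1.0, 0.6],
                  [0.5, 1.0]])                      # unknown mixing matrix
    f = [np.tanh, lambda u: u ** 3]                 # unknown invertible distortions
    g = [np.arctanh, np.cbrt]                       # their inverses (assumed known here)
    lin = A @ s
    x = np.vstack([f[i](lin[i]) for i in range(2)])     # observed PNL mixture

    # Step 1: compensate the component-wise nonlinearities.
    z = np.vstack([g[i](x[i]) for i in range(2)])
    z -= z.mean(axis=1, keepdims=True)

    # Step 2: whiten the linearized data.
    d, E = np.linalg.eigh(z @ z.T / T)
    zw = (E @ np.diag(d ** -0.5) @ E.T) @ z

    # Step 3: AMUSE-style separation, diagonalizing one time-lagged covariance.
    tau = 5
    C_tau = zw[:, :-tau] @ zw[:, tau:].T / (T - tau)
    C_tau = 0.5 * (C_tau + C_tau.T)                 # symmetrize
    _, V = np.linalg.eigh(C_tau)
    y = V.T @ zw                                    # recovered sources

The recovered y matches s only up to the usual permutation, sign and scaling indeterminacies of linear BSS, which is exactly the reduction of the nonlinear indeterminacies that the PNL constraint provides.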
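The variational ensemble learning idea can likewise be summarized by its cost function. Writing X for the observations, S for the sources and θ for the parameters of the nonlinear mapping, ensemble learning fits a tractable approximation q(S, θ) to the posterior by minimizing (the notation here is generic and only meant to illustrate the principle):

    C(q) = \int q(S, \theta) \, \log \frac{q(S, \theta)}{p(X, S, \theta)} \, dS \, d\theta
         = D_{\mathrm{KL}}\big( q(S, \theta) \,\big\|\, p(S, \theta \mid X) \big) - \log p(X)

Because C(q) equals the Kullback-Leibler divergence to the true posterior minus the log evidence, minimizing it searches for the sources and the mapping that most probably generated the data under the prior, while penalizing unnecessarily complex nonlinear models; this is the regularizing effect referred to above.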
