A Constrained EM Algorithm for Independent Component Analysis

We introduce a novel way of performing independent component analysis using a constrained version of the expectation-maximization (EM) algorithm. The source distributions are modeled as D one-dimensional mixtures of gaussians. The observed data are modeled as linear mixtures of the sources with additive, isotropic noise. This generative model is fit to the data using constrained EM. We also introduce a simpler soft-switching approach, which uses only one parameter to decide on the sub- or supergaussian nature of the sources. Finally, we explain how our approach relates to independent factor analysis.
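
As a rough illustration of the generative model described above, the following Python sketch draws each source from a one-dimensional mixture of gaussians, mixes the sources linearly, and adds isotropic noise. All names and parameter values here (D, K, N, the mixing matrix A, the noise level sigma) are illustrative assumptions, not taken from the paper; fitting the mixing matrix and mixture parameters by constrained EM is the paper's contribution and is not shown.

```python
# Minimal sketch of the generative model (illustrative, not the authors' code):
# each of D sources is an independent 1-D mixture of gaussians, and the
# observations are a linear mixture of the sources plus isotropic noise.
import numpy as np

rng = np.random.default_rng(0)

D, K, N = 2, 3, 1000  # sources, mixture components per source, samples (assumed)

# Per-source mixture-of-gaussians parameters (assumed values).
weights = rng.dirichlet(np.ones(K), size=D)  # mixing proportions, rows sum to 1
means = rng.normal(0.0, 2.0, size=(D, K))    # component means
stds = rng.uniform(0.5, 1.5, size=(D, K))    # component standard deviations

# Draw each source independently from its 1-D mixture of gaussians.
sources = np.empty((D, N))
for d in range(D):
    comp = rng.choice(K, size=N, p=weights[d])        # latent component labels
    sources[d] = rng.normal(means[d, comp], stds[d, comp])

# Linear mixing with additive isotropic gaussian noise: x = A s + n.
A = rng.normal(size=(D, D))  # mixing matrix (square here for simplicity)
sigma = 0.1                  # isotropic noise standard deviation
X = A @ sources + sigma * rng.normal(size=(D, N))
```

Given data X generated this way, an EM fit would alternate between computing posterior responsibilities over the latent components and sources (E-step) and updating the mixture parameters and mixing matrix subject to the paper's constraints (M-step).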
