On the Effect of the Form of the Posterior Approximation in Variational Learning of ICA Models

We show that the choice of posterior approximation affects the solution found in Bayesian variational learning of linear independent component analysis (ICA) models. Assuming the sources to be independent a posteriori favours a solution whose mixing vectors are orthogonal. Linear mixing models with either temporally correlated or non-Gaussian sources are considered, but the analysis extends to nonlinear mixtures as well.
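To see why a fully factorized posterior approximation favours orthogonal mixing vectors, the following minimal sketch may help; it assumes isotropic Gaussian observation noise and a Gaussian posterior over the sources, and the symbols ($\mathbf{x}$, $\mathbf{A}$, $\mathbf{s}$, $\mathbf{n}$, $\sigma^2$, $\lambda_i$) are introduced here for illustration rather than taken from the paper's own derivation.

Under a linear model $\mathbf{x} = \mathbf{A}\mathbf{s} + \mathbf{n}$ with $\mathbf{n} \sim \mathcal{N}(\mathbf{0}, \sigma^2 \mathbf{I})$ and independent Gaussian source statistics with precisions $\lambda_i$, the posterior over the sources for fixed $\mathbf{A}$ is Gaussian with precision matrix
$$
\boldsymbol{\Lambda} = \frac{1}{\sigma^2}\mathbf{A}^{\top}\mathbf{A} + \mathrm{diag}(\lambda_1, \dots, \lambda_n),
$$
whose off-diagonal elements are $\mathbf{a}_i^{\top}\mathbf{a}_j / \sigma^2$. A factorial approximation $q(\mathbf{s}) = \prod_i q(s_i)$ can only represent diagonal posterior covariances, so the Kullback-Leibler term $D_{\mathrm{KL}}(q \,\|\, p)$ in the variational bound is smallest when $\mathbf{A}^{\top}\mathbf{A}$ is diagonal, that is, when the mixing vectors $\mathbf{a}_i$ are mutually orthogonal. Variational learning with this approximation therefore trades some data fit for a mixing matrix with (near-)orthogonal columns, which is the bias discussed in the abstract.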
