Building Blocks for Hierarchical Latent Variable Models

We introduce building blocks from which a large variety of latent variable models can be built. The blocks include continuous and discrete variables, summation, addition, nonlinearity and switching. Ensemble learning provides a cost function which can be used both for updating the variables and for optimising the model structure. The blocks are designed to fit together and to yield efficient update rules. Emphasis is on local computation, which results in linear computational complexity. We propose and test a structure with a hierarchical nonlinear model for variances and means.
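For reference, the cost function provided by ensemble learning (variational Bayesian learning) is the Kullback-Leibler divergence between a tractable posterior approximation and the true posterior, usually written against the joint density of the data and the unknowns. The display below is a standard formulation added here as a sketch, with \(\boldsymbol{X}\) denoting the data and \(\boldsymbol{\theta}\) all latent variables and parameters; this notation is ours and is not quoted from the abstract:

\[
  \mathcal{C}(q)
  \;=\; \int q(\boldsymbol{\theta})\,
        \ln \frac{q(\boldsymbol{\theta})}{p(\boldsymbol{X},\boldsymbol{\theta})}\,
        \mathrm{d}\boldsymbol{\theta}
  \;=\; D_{\mathrm{KL}}\!\left( q(\boldsymbol{\theta}) \,\middle\|\, p(\boldsymbol{\theta}\mid\boldsymbol{X}) \right)
        \;-\; \ln p(\boldsymbol{X}).
\]

Because the Kullback-Leibler term is non-negative, \(\mathcal{C}(q)\) upper-bounds \(-\ln p(\boldsymbol{X})\), so minimising it simultaneously fits the posterior approximation and yields a bound on the model evidence that can be compared across candidate structures. The hierarchical model for variances and means mentioned above can likewise be sketched as a Gaussian variable whose mean and log-variance are themselves outputs of other blocks, e.g. \( x \sim \mathcal{N}\!\bigl(m,\,\exp(-v)\bigr) \) with \(m\) and \(v\) produced by higher layers; the \(\exp(-v)\) parameterisation is an assumption borrowed from the authors' related building-block work, not stated in this abstract.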
