Ensemble of Independent Factor Analyzers with Application to Natural Image Analysis

In this paper the ensemble of independent factor analyzers (EIFA) is proposed. This new statistical model assumes that each data point is generated by the sum of the outputs of independently activated factor analyzers. A maximum likelihood (ML) estimation algorithm for the parameters is derived using a Monte Carlo EM algorithm with a Gibbs sampler. The EIFA model is applied to natural image data. As learning progresses, the independent factor analyzers develop into feature detectors that resemble complex cells in mammalian visual systems. Although this result is similar to a previous one obtained by independent subspace analysis, we observe the emergence of complex cells from natural images in a more general class of models, including overcomplete models that allow additive noise in the observables.
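The generative model described above can be sketched in code. The following is a minimal, hypothetical illustration, not the paper's implementation: it assumes each of K factor analyzers has Gaussian latent factors with loading matrix A_k, is switched on independently with some Bernoulli probability, and that the observation is the sum of the active analyzers' outputs plus isotropic Gaussian noise. The specific dimensions, activation probabilities, and noise level are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: D observed variables, K analyzers, d latent factors each.
D, K, d = 16, 3, 2

# Parameters of each factor analyzer (randomly initialized for illustration).
loadings = [rng.normal(size=(D, d)) for _ in range(K)]  # loading matrices A_k
activation_p = np.full(K, 0.5)                          # P(analyzer k is active)
noise_std = 0.1                                         # additive observation noise

def sample_eifa(n):
    """Draw n data points from the assumed EIFA generative model:
    each point is the sum of the outputs of the independently
    activated factor analyzers, plus isotropic Gaussian noise."""
    X = np.zeros((n, D))
    for i in range(n):
        x = np.zeros(D)
        for k in range(K):
            if rng.random() < activation_p[k]:       # independent activation
                z = rng.normal(size=d)               # latent factors ~ N(0, I)
                x += loadings[k] @ z                 # analyzer output A_k z
        X[i] = x + noise_std * rng.normal(size=D)    # additive Gaussian noise
    return X

X = sample_eifa(100)
print(X.shape)  # → (100, 16)
```

In the paper's ML estimation, the discrete activation variables and continuous factors would be treated as latent variables and sampled with a Gibbs sampler inside a Monte Carlo EM loop; the sketch covers only the forward (generative) direction.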
