NESTING EM ALGORITHMS FOR COMPUTATIONAL EFFICIENCY

Computing posterior modes (e.g., maximum likelihood estimates) for models involving latent variables or missing data often requires complicated optimization procedures. By splitting this task into two simpler parts, however, EM-type algorithms often offer a simple solution. Although this approach has proven useful, in some settings even these simpler tasks are challenging. In particular, computations involving latent variables are typically difficult to simplify. Thus, in models such as hierarchical models with complicated latent variable structures, computationally intensive methods may be required for the expectation step of EM. This paper describes how nesting two or more EM algorithms can take advantage of closed-form conditional expectations and lead to algorithms that converge faster, are straightforward to implement, and enjoy stable convergence properties. Methodology to monitor convergence of nested EM algorithms is developed using importance and bridge sampling. The strategy is applied to hierarchical probit and t regression models to derive algorithms that incorporate aspects of Monte Carlo EM, PX-EM, and nesting in order to combine computational efficiency with easy implementation.
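The basic split described above, a closed-form E-step (conditional expectations of the latent variables) alternating with a simple M-step, can be illustrated with a minimal sketch. The example below is a toy two-component normal mixture with known variance and mixing weight, chosen only to show the generic EM recursion; it is not one of the hierarchical probit or t models treated in the paper, and all function and variable names are hypothetical.

```python
import numpy as np

def em_mixture(x, mu0, mu1, sigma=1.0, pi=0.5, n_iter=50):
    """EM for a two-component normal mixture with known variance sigma
    and known mixing weight pi; estimates the component means mu0, mu1.

    The E-step computes closed-form posterior membership probabilities
    for the latent component indicators; the M-step maximizes the
    resulting expected complete-data log-likelihood."""
    for _ in range(n_iter):
        # E-step: posterior probability that each point belongs to component 1
        d0 = pi * np.exp(-0.5 * ((x - mu0) / sigma) ** 2)
        d1 = (1.0 - pi) * np.exp(-0.5 * ((x - mu1) / sigma) ** 2)
        w = d1 / (d0 + d1)
        # M-step: weighted sample means maximize the expected log-likelihood
        mu0 = np.sum((1.0 - w) * x) / np.sum(1.0 - w)
        mu1 = np.sum(w * x) / np.sum(w)
    return mu0, mu1
```

When the E-step expectations are not available in closed form, as in the hierarchical models the paper targets, the weights `w` would instead be approximated by simulation, which is the Monte Carlo EM variant referenced in the abstract; nesting exploits the fact that some of the conditional expectations remain in closed form.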
