The Computational Complexity of Estimating Convergence Time

An important problem in implementing Markov chain Monte Carlo algorithms is determining the convergence time, that is, the number of iterations before the chain is close to stationarity. For many Markov chains used in practice this time is not known, and even when the convergence time is known to be polynomial, the theoretical bounds are often too crude to be useful. Practitioners therefore carry out some form of statistical analysis to assess convergence, which has led to the development of a number of methods, known as convergence diagnostics, that attempt to detect whether the chain is still far from stationarity. We study the problem of testing convergence in three settings and prove that it is computationally hard in each. First, given a rapidly mixing Markov chain and a starting state, it is SZK-hard (hard for Statistical Zero Knowledge) to distinguish whether the chain is close to stationarity by time t or far from stationarity at time ct, for a constant c; we also show that this problem lies in AM ∩ coAM. Second, given a rapidly mixing Markov chain, it is coNP-hard to distinguish whether it is close to stationarity by time t or far from stationarity at time ct, for a constant c; this problem lies in coAM. Finally, it is PSPACE-complete to distinguish whether the Markov chain is close to stationarity by time t or far from mixed at time ct, for any c ≥ 1.
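
To make the notion of "close to stationarity by time t" concrete, the sketch below brute-forces the total variation distance between the t-step distribution of a small, explicitly given chain and its stationary distribution, at times t and ct. This is only an illustration under assumptions: the example chain, the times t and c, and the use of NumPy are choices of this sketch and do not come from the paper. Such a direct computation is feasible only when the transition matrix is given explicitly and is small; it is exactly what becomes intractable when the chain is specified succinctly, which is the regime of the hardness results above.

```python
import numpy as np

def tv_distance(p, q):
    """Total variation distance between two distributions on a finite state space."""
    return 0.5 * np.abs(p - q).sum()

def distance_to_stationarity(P, start, t):
    """TV distance between the t-step distribution from `start` and the stationary distribution."""
    n = P.shape[0]
    # Stationary distribution: left eigenvector of P for eigenvalue 1, normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi = pi / pi.sum()
    # Distribution after t steps, starting from a point mass at `start`.
    mu = np.zeros(n)
    mu[start] = 1.0
    mu_t = mu @ np.linalg.matrix_power(P, t)
    return tv_distance(mu_t, pi)

if __name__ == "__main__":
    # Lazy random walk on a 4-cycle (an illustrative chain, not one from the paper).
    P = np.array([[0.50, 0.25, 0.00, 0.25],
                  [0.25, 0.50, 0.25, 0.00],
                  [0.00, 0.25, 0.50, 0.25],
                  [0.25, 0.00, 0.25, 0.50]])
    t, c = 5, 2
    print("distance at time t  :", distance_to_stationarity(P, start=0, t=t))
    print("distance at time c*t:", distance_to_stationarity(P, start=0, t=c * t))
```

The decision problem studied in the paper asks which of two promised cases holds (distance already small at time t, or still large at time ct); the hardness results say that no efficient procedure can resolve this in general when the chain is presented succinctly, even though the brute-force check above is trivial for an explicit matrix.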
