On an adaptive version of the Metropolis-Hastings algorithm with independent proposal distribution

In this paper, we present a general formulation of an algorithm, the adaptive independent chain (AIC), that was introduced in a special context in Gasemyr et al. [Methodol. Comput. Appl. Probab. 3 (2001)]. The algorithm aims at producing samples from a specific target distribution Π, and is an adaptive, non-Markovian version of the Metropolis-Hastings independent chain. A certain parametric class of possible proposal distributions is fixed, and the parameters of the proposal distribution are updated periodically on the basis of the recent history of the chain, thereby obtaining proposals that approach Π ever more closely. We show that under certain conditions the algorithm produces an exact sample from Π in a finite number of iterations, and hence that it converges to Π. We also present another adaptive algorithm, the componentwise adaptive independent chain (CAIC), which may be an attractive alternative, particularly in high dimensions. The CAIC may be regarded as an adaptive approximation to the Gibbs sampler, updating parametric approximations to the conditionals of Π.
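To make the idea of an adaptive independent chain concrete, the following is a minimal sketch, not the paper's algorithm: it assumes a Gaussian proposal family whose mean and covariance are refit periodically to the recent history of the chain, and all names (`aic_sample`, `log_target`, `adapt_every`) are illustrative.

```python
# Sketch of an adaptive independent-chain Metropolis-Hastings sampler.
# Assumption: the proposal family is multivariate Gaussian, periodically
# refit to the recent chain history (one possible instance of adaptation).
import numpy as np

def aic_sample(log_target, x0, n_iter=5000, adapt_every=500, rng=None):
    """Adaptive independence sampler (illustrative, not the paper's AIC)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    d = x.size
    mean, cov = np.zeros(d), np.eye(d)        # initial proposal parameters
    chain = np.empty((n_iter, d))

    def log_proposal(y):
        # Log density of N(mean, cov) at y.
        diff = y - mean
        _, logdet = np.linalg.slogdet(cov)
        quad = diff @ np.linalg.solve(cov, diff)
        return -0.5 * (d * np.log(2.0 * np.pi) + logdet + quad)

    lp_x, lq_x = log_target(x), log_proposal(x)
    for t in range(n_iter):
        y = rng.multivariate_normal(mean, cov)      # independent proposal
        lp_y, lq_y = log_target(y), log_proposal(y)
        # Independence-sampler acceptance ratio: pi(y) q(x) / (pi(x) q(y)).
        if np.log(rng.uniform()) < (lp_y - lp_x) + (lq_x - lq_y):
            x, lp_x, lq_x = y, lp_y, lq_y
        chain[t] = x

        # Periodically update the proposal from the recent history.
        if (t + 1) % adapt_every == 0:
            recent = chain[t + 1 - adapt_every : t + 1]
            mean = recent.mean(axis=0)
            cov = np.cov(recent.T) + 1e-6 * np.eye(d)   # regularize
            lq_x = log_proposal(x)                      # q changed, recompute
    return chain

# Usage example: sample from a correlated bivariate Gaussian target.
if __name__ == "__main__":
    prec = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
    log_target = lambda x: -0.5 * x @ prec @ x
    samples = aic_sample(log_target, x0=np.zeros(2))
    print(samples.mean(axis=0))
    print(np.cov(samples.T))
```

Because the proposal parameters depend on past samples, the resulting process is non-Markovian, which is why adaptive schemes of this kind require a separate convergence argument such as the one given in the paper.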