Information criteria for the predictive evaluation of Bayesian models

As natural successors of the information criteria AIC and ABIC, information criteria for Bayesian models are developed by evaluating the bias of the log-likelihood of the predictive distribution as an estimate of its expected log-likelihood. Considering two specific situations for the true distribution, two information criteria, PIC1 and PIC2, are derived. Linear Gaussian cases are treated in detail, and the evaluation of the maximum a posteriori estimator is also considered. A simple example of estimating the signal-to-noise ratio shows that PIC2 is a good approximation to the expected log-likelihood over the entire range of the signal-to-noise ratio, whereas PIC1 performs well only for smaller values of the variance ratio. For illustration, the problems of trend estimation and seasonal adjustment are considered. Examples show that the hyper-parameters estimated by the new criteria are usually closer to the best ones than those obtained by ABIC.
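As a rough sketch of the construction described above (generic notation only; the exact forms of PIC1 and PIC2 depend on the assumptions made on the true distribution and are not reproduced here): let $y$ denote the observed data, $g$ the true distribution, and $p(\tilde{y} \mid y)$ the Bayesian predictive distribution. The log-likelihood of the predictive distribution evaluated at the data, $\log p(y \mid y)$, is an optimistic estimate of the expected log-likelihood $\mathrm{E}_{g}[\log p(\tilde{y} \mid y)]$, and a criterion of the PIC type corrects this optimism by an estimate of the bias:
\[
b = \mathrm{E}_{g}\!\left[\, \log p(y \mid y) - \mathrm{E}_{g(\tilde{y})}\!\left[\log p(\tilde{y} \mid y)\right] \right],
\qquad
\mathrm{PIC} = -2 \log p(y \mid y) + 2\,\hat{b},
\]
where $\hat{b}$ is an estimate of $b$ obtained under the assumed situation for the true distribution; PIC1 and PIC2 correspond to the two different situations considered for $g$.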