On Bayesian selection of the best normal population using the Kullback–Leibler divergence measure

In this paper, we use a Bayesian approach to study the problem of selecting the best population among k different populations π1, ..., πk (k ≥ 2) relative to a standard (or control) population π0. Here, π0 is considered to be the population with the desired characteristics, and the best population is defined to be the one closest to π0. The procedure is based on minimizing the posterior expected value of the Kullback–Leibler (KL) divergence measure of πi from π0. The populations under consideration are assumed to be multivariate normal. An application to regression problems is also presented. Finally, a numerical example using a real data set illustrates the implementation of the selection procedure.
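As a rough illustration of the selection idea described above, the following sketch computes the closed-form KL divergence between multivariate normal distributions and picks the candidate closest to the control population π0. This is only a plug-in illustration with hypothetical parameter values: the paper's procedure minimizes the *posterior expected* KL divergence under a Bayesian model, whereas here the means and covariances are simply assumed known.

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL divergence D(N(mu0, S0) || N(mu1, S1))
    between two d-dimensional multivariate normals."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)          # trace term
                  + diff @ S1_inv @ diff         # Mahalanobis term
                  - d                            # dimension correction
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Hypothetical control population pi_0 and two candidate populations
mu_c, S_c = np.zeros(2), np.eye(2)
candidates = [
    (np.array([0.1, -0.1]), np.eye(2)),       # nearly matches pi_0
    (np.array([2.0, 2.0]), 1.5 * np.eye(2)),  # far from pi_0
]

# Select the candidate with the smallest divergence from pi_0
divs = [kl_mvn(mu_c, S_c, mu, S) for mu, S in candidates]
best = int(np.argmin(divs))  # -> 0, the first candidate
```

In the Bayesian version of the procedure, `kl_mvn` would be averaged over posterior draws of each population's parameters rather than evaluated at fixed values.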