Asymptotic behaviour of weighted differential entropies in a Bayesian problem

We consider the Bayesian problem of estimating the probability of success in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the differential entropy of the posterior probability density function conditional on $x$ successes after $n$ conditionally independent trials, as $n \to \infty$. It is shown that, after an appropriate normalization, in the cases $x \sim n$ and $x \sim n^\beta$ ($0<\beta<1$) the limiting distribution is Gaussian and the differential entropy of the standardized random variable converges to the differential entropy of a standard Gaussian random variable. When $x$ or $n-x$ is a constant, the limiting distribution is not Gaussian, but the asymptotics of the differential entropy can still be found explicitly. Suppose now that one is interested in whether the coin is fair and, for large $n$, in the true frequency. To this end, the concept of weighted differential entropy introduced in \cite{Belis1968} is used when the frequency $\gamma$ needs to be emphasized. It is found that the weight in the suggested form does not change the asymptotic forms of the Shannon, R\'enyi, Tsallis and Fisher entropies, but changes the constants. The leading term of the weighted Fisher information is changed by a constant which depends on the distance between the true frequency and the value $\gamma$ we want to emphasize. In the third part we derive weighted versions of the Rao--Cram\'er, Bhattacharyya and Kullback inequalities. These results are applied to the Bayesian problem described above. The asymptotic forms of these inequalities are obtained for a particular class of weight functions.
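As a numerical illustration (not part of the paper's derivation), assuming a uniform prior so that the posterior after $x$ successes in $n$ trials is Beta$(x+1,\,n-x+1)$, the following sketch checks the claim for the case $x \sim n$: the differential entropy of the standardized posterior approaches that of a standard Gaussian, $\tfrac{1}{2}\ln(2\pi e) \approx 1.4189$. It uses the closed-form Beta entropy and the scaling identity $h((T-\mu)/\sigma) = h(T) - \ln\sigma$.

```python
import math
from scipy.special import betaln, digamma

def beta_entropy(a, b):
    """Closed-form differential entropy of Beta(a, b):
    ln B(a,b) - (a-1)psi(a) - (b-1)psi(b) + (a+b-2)psi(a+b)."""
    return (betaln(a, b)
            - (a - 1) * digamma(a)
            - (b - 1) * digamma(b)
            + (a + b - 2) * digamma(a + b))

def standardized_posterior_entropy(n, x):
    """Entropy of the standardized posterior Beta(x+1, n-x+1),
    via h((T - mu)/sigma) = h(T) - ln(sigma)."""
    a, b = x + 1, n - x + 1
    sigma = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return beta_entropy(a, b) - math.log(sigma)

gauss = 0.5 * math.log(2 * math.pi * math.e)  # entropy of N(0, 1)
for n in (100, 1000, 10000):
    h = standardized_posterior_entropy(n, n // 2)  # x ~ n regime
    print(n, round(h, 6), round(h - gauss, 6))
```

The printed gap $h - \tfrac{1}{2}\ln(2\pi e)$ shrinks as $n$ grows, consistent with the Gaussian limit; the $x \sim n^\beta$ regime can be explored by replacing `n // 2` with, e.g., `int(n ** 0.5)`.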