Maximum-entropy distributions having prescribed first and second moments (Corresp.)
The entropy H of an absolutely continuous distribution with probability density function p(x) is defined as H = - \int p(x) \log p(x) dx . The formal maximization of H , subject to the moment constraints \int x^r p(x) dx = \mu_r, r = 0,1,\cdots,m , leads to p(x) = \exp (- \sum_{r=0}^m \lambda_r x^r) , where the \lambda_r have to be chosen so as to satisfy the moment constraints. Only the case m = 2 is considered. It is shown that when x has finite range, a distribution maximizing the entropy exists and is unique. When the range is [0,\infty) , the maximum-entropy distribution exists if, and only if, \mu_2 \leq 2 \mu_1^2 , and a table is given which enables the maximum-entropy distribution to be computed. The case \mu_2 > 2 \mu_1^2 is discussed in some detail.
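The boundary case \mu_2 = 2 \mu_1^2 corresponds to \lambda_2 = 0 , i.e. the exponential density, for which E[X^2] = 2(E[X])^2 and H = 1 + \log \mu_1 . A minimal numerical sketch (pure Python with trapezoidal quadrature; the function names and grid parameters are illustrative, not from the paper) that computes the moments and entropy of the maximum-entropy family p(x) \propto \exp(-\lambda_1 x - \lambda_2 x^2) on [0,\infty) and checks this boundary case:

```python
import math

def trap(vals, h):
    # composite trapezoidal rule on an equally spaced grid
    return (sum(vals) - 0.5 * (vals[0] + vals[-1])) * h

def moments_entropy(lam1, lam2, upper=50.0, n=20000):
    """Moments and entropy of p(x) proportional to exp(-lam1*x - lam2*x^2)
    on [0, upper]; upper is chosen large enough that the tail is negligible.
    The normalizer Z plays the role of exp(lambda_0)."""
    h = upper / n
    xs = [i * h for i in range(n + 1)]
    w = [math.exp(-lam1 * x - lam2 * x * x) for x in xs]
    Z = trap(w, h)
    p = [wi / Z for wi in w]
    mu1 = trap([pi * x for pi, x in zip(p, xs)], h)
    mu2 = trap([pi * x * x for pi, x in zip(p, xs)], h)
    H = -trap([pi * math.log(pi) for pi in p], h)
    return mu1, mu2, H

# Boundary case lambda_2 = 0: exponential with mean 1,
# so mu_1 = 1, mu_2 = 2 = 2*mu_1^2, and H = 1 + log(1) = 1.
mu1, mu2, H = moments_entropy(1.0, 0.0)
```

Taking \lambda_2 > 0 pushes \mu_2 below 2 \mu_1^2 , which is consistent with the existence condition quoted above.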