Passage to the Limit under the Information and Entropy Signs
The main result of this paper amounts to the following statement: if a sequence of pairs of random variables $(\xi_n, \eta_n)$ converges in variation to a pair of random variables $(\xi, \eta)$, then $\lim_{n \to \infty} I(\xi_n, \eta_n) = I(\xi, \eta)$, where $I(\xi, \eta)$ is the information of the pair $(\xi, \eta)$, if and only if the sequence of corresponding information densities is uniformly integrable. A similar result is proved for entropies and for a new concept of information to within a probability $\varepsilon$ of events. Conditions for the convergence of these quantities are found.
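For orientation, the standard definitions behind these terms (the abstract itself does not restate them) are as follows: the information density of a pair is the logarithm of the Radon–Nikodym derivative of the joint distribution with respect to the product of the marginals,
$$ i_{\xi\eta}(x, y) = \log \frac{dP_{\xi\eta}}{d(P_\xi \times P_\eta)}(x, y), \qquad I(\xi, \eta) = \mathbf{E}\, i_{\xi\eta}(\xi, \eta), $$
and uniform integrability of the densities $i_{\xi_n\eta_n}(\xi_n, \eta_n)$ means
$$ \lim_{C \to \infty} \; \sup_n \; \mathbf{E}\bigl[\, |i_{\xi_n\eta_n}(\xi_n, \eta_n)| \, ; \; |i_{\xi_n\eta_n}(\xi_n, \eta_n)| > C \,\bigr] = 0, $$
i.e., the tails of the densities contribute uniformly little to the expectations. Under this condition, convergence in variation of the pairs carries over to convergence of the informations.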