On characterization of entropy function via information inequalities
The properties of the so-called basic information inequalities of Shannon's information measures are discussed. Do these inequalities fully characterize the entropy function? To make this question precise, we view an entropy function as a (2^n − 1)-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set {1, 2, ..., n}. The main discovery of this paper is a new information inequality involving four discrete random variables that gives a negative answer to this fundamental problem of information theory: the basic inequalities alone do not characterize the entropy function.
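The entropy-vector viewpoint can be made concrete with a small sketch. The code below (an illustration, not part of the paper; the function name `entropy_vector` and the dictionary-based joint pmf representation are my own) computes the (2^n − 1) coordinates H(X_S), one for each nonempty subset S of the n variables, from a joint probability mass function.

```python
import itertools
import math

def entropy_vector(pmf, n):
    """Compute the (2**n - 1)-dimensional entropy vector of n discrete
    random variables. Coordinates are indexed by nonempty subsets S of
    {0, ..., n-1}; `pmf` maps n-tuples of outcomes to probabilities.
    (Illustrative helper, not from the paper.)"""
    vec = {}
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            # Marginalize the joint pmf onto the coordinates in S.
            marginal = {}
            for outcome, p in pmf.items():
                key = tuple(outcome[i] for i in S)
                marginal[key] = marginal.get(key, 0.0) + p
            # Shannon entropy of the marginal, in bits.
            vec[S] = -sum(p * math.log2(p)
                          for p in marginal.values() if p > 0)
    return vec

# Example: two independent fair bits.
pmf = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
v = entropy_vector(pmf, 2)
# v[(0,)] == 1.0, v[(1,)] == 1.0, v[(0, 1)] == 2.0
```

For n = 2 the vector has 3 coordinates; for the four-variable inequality of this paper, n = 4 gives a 15-dimensional vector. The basic (Shannon-type) inequalities constrain which such vectors can arise, and the paper's result is that they do not cut out exactly the set of achievable entropy vectors.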