The complexity of approximating the entropy

The Shannon entropy is a measure of the randomness of a distribution and plays a central role in statistics, information theory, and data compression. Knowing the entropy of a random source sheds light on the compressibility of the data it produces. We consider the complexity of approximating the entropy under various assumptions on how the input is presented.
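For concreteness, the Shannon entropy of a distribution p over a finite domain is H(p) = -Σ_i p_i log_2 p_i. The sketch below is a minimal Python illustration of the naive "plug-in" estimate, which simply evaluates this formula on the empirical distribution of a sample; it is a baseline for reference only, not an algorithm from this work, and the function and variable names are ours.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Naive plug-in estimate of Shannon entropy, in bits:
    H(p_hat) = -sum_i p_hat_i * log2(p_hat_i), where p_hat is the
    empirical distribution of the observed samples."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Example: samples from a fair coin should give an estimate near 1 bit.
print(plugin_entropy(["H", "T"] * 500))  # -> 1.0
```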
