On the entropy of a function
A common claim made when discussing the efficiency of compression schemes such as JPEG is that the transforms they employ, the discrete cosine transform or a wavelet transform, decorrelate the data. The standard measure of the information content of the data is the probabilistic (Shannon) entropy. The data can, in this case, be regarded as the sampled values of a function. However, no sampling-independent definition of the entropy of a function has been proposed. Such a definition is given here, and it is shown that the entropy so defined agrees with the entropy of the sampled data in the limit as the sample spacing goes to zero.
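To make the baseline quantity concrete, the sketch below computes the usual "entropy of the sampled data": the empirical Shannon entropy of quantized samples of a function on an interval. This is a minimal illustration of the standard probabilistic entropy the abstract compares against, not the paper's sampling-independent definition; the function names, bin count, and sampling scheme are assumptions for illustration only.

```python
import math

def sample_entropy(f, a, b, n_samples, n_bins):
    """Empirical Shannon entropy (in bits) of quantized samples of f on [a, b].

    Illustrative sketch: the samples are quantized into n_bins equal-width
    value bins, and the entropy of the resulting histogram is returned.
    """
    xs = [a + (b - a) * i / (n_samples - 1) for i in range(n_samples)]
    vals = [f(x) for x in xs]
    lo, hi = min(vals), max(vals)
    width = (hi - lo) / n_bins or 1.0  # guard against a constant function
    counts = {}
    for v in vals:
        k = min(int((v - lo) / width), n_bins - 1)
        counts[k] = counts.get(k, 0) + 1
    return -sum((c / n_samples) * math.log2(c / n_samples)
                for c in counts.values())

# For a fixed quantization, refining the sampling (larger n_samples)
# stabilizes the histogram entropy, echoing the limit the paper studies.
h = sample_entropy(math.sin, 0.0, 2 * math.pi, 10000, 64)
```

For a fixed number of bins, the value computed here converges as the sample spacing shrinks; the paper's contribution is a definition of entropy attached to the function itself that matches this limit.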