On Measure-Theoretic Definitions of Generalized Information Measures and Maximum Entropy Prescriptions

Though the Shannon entropy of a probability measure $P$ on a measure space $(X, \mathfrak{M}, \mu)$, defined as $- \int_{X} \frac{\mathrm{d}P}{\mathrm{d}\mu} \ln \frac{\mathrm{d}P}{\mathrm{d}\mu} \,\mathrm{d}\mu$, does not qualify as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the measure-theoretic case are consistent with those of the discrete case. In this paper, we study measure-theoretic definitions of generalized information measures and discuss the corresponding ME prescriptions. We present two results in this regard: (i) we prove that, as in the case of classical relative entropy, the measure-theoretic definitions of the generalized relative entropies, R\'{e}nyi and Tsallis, are natural extensions of their respective discrete cases; (ii) we show that the ME prescriptions of measure-theoretic Tsallis entropy are consistent with the discrete case.
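For orientation, the measure-theoretic forms of the generalized relative entropies referred to above can be sketched as follows; these are the standard R\'{e}nyi and Tsallis relative-entropy functionals, stated here under the assumption $P \ll R$ (the paper's own definitions should be consulted for the exact conventions):
$$D_{\alpha}(P\|R) \;=\; \frac{1}{\alpha-1}\,\ln \int_{X} \left(\frac{\mathrm{d}P}{\mathrm{d}R}\right)^{\alpha-1} \mathrm{d}P, \qquad \alpha > 0,\ \alpha \neq 1,$$
$$D_{q}(P\|R) \;=\; \frac{1}{q-1}\left(\int_{X} \left(\frac{\mathrm{d}P}{\mathrm{d}R}\right)^{q-1} \mathrm{d}P \;-\; 1\right), \qquad q > 0,\ q \neq 1.$$
Both reduce to the classical relative entropy $\int_{X} \ln \frac{\mathrm{d}P}{\mathrm{d}R}\,\mathrm{d}P$ in the limits $\alpha \to 1$ and $q \to 1$, which is the sense in which they generalize it.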
