Information theoretical properties of Tsallis entropies

A chain rule and a subadditivity for the entropy of type β, which is one of the nonadditive entropies, were derived by Daróczy. In this paper, we study further relations among Tsallis-type entropies, which are typical nonadditive entropies. The chain rule is generalized by establishing it for the Tsallis relative entropy and the nonadditive entropy. We show several inequalities related to Tsallis entropies, in particular the strong subadditivity for Tsallis-type entropies and the subadditivity for the nonadditive entropies. The subadditivity and the strong subadditivity naturally lead to the definitions of Tsallis mutual entropy and Tsallis conditional mutual entropy, respectively, and we then show chain rules for Tsallis mutual entropies. We give properties of entropic distances in terms of Tsallis entropies. Finally, we show parametrically extended results based on information theory.
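For orientation, a minimal sketch of the quantities involved, assuming the standard conventions of the Tsallis literature (the paper's own definitions may differ in detail): for a discrete random variable $X$ with distribution $p$, the Tsallis entropy of index $q \neq 1$ is

\[ S_q(X) = \frac{1}{q-1}\Big(1 - \sum_{x} p(x)^q\Big), \]

which recovers the Shannon entropy as $q \to 1$. For independent $X$ and $Y$ it is nonadditive (pseudoadditive):

\[ S_q(X \times Y) = S_q(X) + S_q(Y) + (1-q)\, S_q(X)\, S_q(Y). \]

With the conditional entropy $S_q(Y|X) = \sum_{x} p(x)^q\, S_q(Y|X=x)$, the chain rule reads $S_q(X,Y) = S_q(X) + S_q(Y|X)$, and the subadditivity $S_q(X,Y) \le S_q(X) + S_q(Y)$ (which holds for $q \ge 1$) makes the Tsallis mutual entropy $I_q(X;Y) = S_q(X) + S_q(Y) - S_q(X,Y)$ nonnegative.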

[1]  C. E. Shannon, A Mathematical Theory of Communication, 1948.

[2]  C. Tsallis, et al., Asymptotically scale-invariant occupancy of phase space makes the entropy S_q extensive, Proceedings of the National Academy of Sciences of the United States of America, 2005.

[3]  S. Furuichi, On uniqueness theorems for Tsallis entropy and Tsallis relative entropy, IEEE Transactions on Information Theory, 2005.

[4]  K. Yanagi, et al., Fundamental properties of Tsallis relative entropy, cond-mat/0406178, 2004.

[5]  H. Suyari, Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy, IEEE Transactions on Information Theory, 2002.

[6]  H. Suyari, et al., Comment on 'Note on generalization of Shannon theorem and inequality', 2003.

[7]  S. Abe, Stability of Tsallis entropy and instabilities of Rényi and normalized Tsallis entropies: a basis for q-exponential distributions, Physical Review E, 2002.

[8]  H. Suyari, Nonextensive entropies derived from form invariance of pseudoadditivity, Physical Review E, 2001.

[9]  T. Yamano, Information theory based on nonadditive information content, Physical Review E, 2000.

[10]  S. Abe, et al., Nonextensive statistical mechanics and its applications, 2001.

[11]  S. Abe, Axioms and uniqueness theorem for Tsallis entropy, cond-mat/0005538, 2000.

[12]  S. Abe and A. K. Rajagopal, Nonadditive conditional entropy and its significance for local realism, quant-ph/0001085, 2000.

[13]  A. K. Rajagopal and S. Abe, Implications of form invariance to the structure of nonextensive entropies, quant-ph/9904029, 1999.

[14]  C. Tsallis, et al., Erratum: "Information gain within generalized thermostatistics" [J. Math. Phys. 39, 6490 (1998)], 1999.

[15]  C. Tsallis, et al., Information gain within nonextensive thermostatistics, 1998.

[16]  M. Shiino, H-theorem with generalized relative entropies and the Tsallis statistics, 1998.

[17]  K. S. Fa, Note on generalization of Shannon theorem and inequality, 1998.

[18]  C. Tsallis, Generalized entropy-based criterion for consistent testing, 1998.

[19]  R. Santos, Generalization of Shannon's theorem for Tsallis entropy, 1997.

[20]  T. M. Cover, et al., Elements of Information Theory, 2005.

[21]  C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, 1988.

[22]  Z. Daróczy, Generalized Information Functions, Information and Control, 1970.

[23]  J. Havrda, et al., Quantification method of classification processes. Concept of structural a-entropy, Kybernetika, 1967.

[24]  C. Rajski, A Metric Space of Discrete Probability Distributions, Information and Control, 1961.