On Some Properties of Tsallis Hypoentropies and Hypodivergences

Both the Kullback–Leibler and the Tsallis divergence share a strong limitation: if the value zero appears in the probability distributions (p1, ··· , pn) and (q1, ··· , qn), it must appear in the same positions for the divergences to be meaningful. To avoid this limitation within the framework of Shannon statistics, Ferreri introduced the hypoentropy in 1980, observing that “such conditions rarely occur in practice”. The aim of the present paper is to extend Ferreri’s hypoentropy to Tsallis statistics. We introduce the Tsallis hypoentropy and the Tsallis hypodivergence and describe their mathematical behavior. Fundamental properties, such as nonnegativity, monotonicity, the chain rule, and subadditivity, are established.
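The zero-position limitation mentioned above can be made concrete with a minimal sketch: under the usual conventions 0·log(0/q) = 0 and p·log(p/0) = +∞ for p > 0, the Kullback–Leibler divergence is finite when zeros occur in matching positions but diverges otherwise. The function name `kl_divergence` below is illustrative, not taken from the paper.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i).

    Conventions: 0 * log(0 / q_i) = 0, and p_i * log(p_i / 0) = +inf
    for p_i > 0 -- the latter is the limitation discussed above.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi == 0:
            continue  # 0 * log(0 / q_i) = 0 by convention
        if qi == 0:
            return math.inf  # zero in q not matched by a zero in p
        total += pi * math.log(pi / qi)
    return total

# Zeros in the same positions: the divergence stays finite.
print(kl_divergence([0.5, 0.5, 0.0], [0.25, 0.75, 0.0]))

# A zero in q where p is positive: the divergence is infinite.
print(kl_divergence([0.5, 0.5], [1.0, 0.0]))  # inf
```

This is exactly the scenario the hypoentropy construction is designed to sidestep.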
