暂无分享,去创建一个
[1] Rodney W. Johnson,et al. Comments on and correction to 'Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy' (Jan 80 26-37) , 1983, IEEE Trans. Inf. Theory.
[2] Suguru Arimoto,et al. Information-Theoretical Considerations on Estimation Problems , 1971, Inf. Control..
[3] C. Tsallis,et al. Two-parameter generalization of the logarithm and exponential functions and Boltzmann-Gibbs-Shannon entropy , 2007, cond-mat/0703792.
[4] Shigeru Furuichi,et al. Some inequalities on generalized entropies , 2011, ArXiv.
[5] Hinrich Schütze,et al. Book Reviews: Foundations of Statistical Natural Language Processing , 1999, CL.
[6] J. Aczel,et al. On Measures of Information and Their Characterizations , 2012 .
[7] Fevzi Büyükkiliç,et al. A fractal approach to entropy and distribution functions , 1993 .
[8] G. Crooks. Inequalities between the Jenson-Shannon and Jeffreys divergences , 2008 .
[9] C. Tsallis. Possible generalization of Boltzmann-Gibbs statistics , 1988 .
[10] A. R. Plastino,et al. On the cut-off prescriptions associated with power-law generalized thermostatistics , 2005 .
[11] H. Jeffreys. An invariant form for the prior probability in estimation problems , 1946, Proceedings of the Royal Society of London. Series A. Mathematical and Physical Sciences.
[12] Fevzi Büyükiliç,et al. A statistical mechanical approach to generalized statistics of quantum and classical gases , 1995 .
[13] M. Masi. A step beyond Tsallis and Rényi entropies , 2005, cond-mat/0505107.
[14] Shigeru Furuichi,et al. An axiomatic characterization of a two-parameter extended relative entropy , 2008, ArXiv.
[15] Thomas M. Cover,et al. Elements of Information Theory , 2005 .
[16] Ambedkar Dukkipati. On Kolmogorov-Nagumo averages and nonextensive entropy , 2010, 2010 International Symposium On Information Theory & Its Applications.
[17] Jianhua Lin,et al. Divergence measures based on the Shannon entropy , 1991, IEEE Trans. Inf. Theory.
[18] G. Bagci,et al. Is Sharma-Mittal entropy really a step beyond Tsallis and Renyi entropies? , 2007, cond-mat/0703277.
[19] Rodney W. Johnson,et al. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy , 1980, IEEE Trans. Inf. Theory.
[20] R. Johnson,et al. Properties of cross-entropy minimization , 1981, IEEE Trans. Inf. Theory.
[21] G. Kaniadakis,et al. Maximum entropy principle and power-law tailed distributions , 2009, 0904.4180.
[22] T. Wada,et al. Multiplicative duality, q-triplet and (μ,ν,q)-relation derived from the one-to-one correspondence between the (μ,ν)-multinomial coefficient and Tsallis entropy Sq , 2007, cond-mat/0702103.
[23] S. Furuichi. On the maximum entropy principle and the minimization of the Fisher information in Tsallis statistics , 2009, 1001.1383.
[24] M. Masi. Generalized information-entropy measures and Fisher information , 2006, cond-mat/0611300.
[25] H. Suyari. Mathematical structures derived from the q-multinomial coefficient in Tsallis statistics , 2004, cond-mat/0401546.
[26] Dick E. Boekee,et al. The R-Norm Information Measure , 1980, Inf. Control..
[27] A. R. Plastino,et al. Thermodynamic Consistency of the $q$-Deformed Fermi-Dirac Distribution in Nonextensive Thermostatics , 2010, 1006.3963.