Some properties of Rényi entropy and Rényi entropy rate

In this paper, we define the conditional Rényi entropy and show that a chain rule holds for the Rényi entropy. We then introduce a relation for the Rényi entropy rate and use it to derive the Rényi entropy rate of an irreducible, aperiodic Markov chain. We also show that the Rényi entropy rate is bounded by the Shannon entropy rate.
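The quantities discussed above can be illustrated numerically. The sketch below assumes the standard order-α Rényi entropy, H_α(X) = (1/(1−α)) log Σ p_i^α, and the known spectral-radius formula for the Rényi entropy rate of a finite-alphabet Markov source (log of the largest eigenvalue of the entrywise power of the transition matrix, divided by 1−α); the function names are illustrative, not taken from the paper.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Order-alpha Rényi entropy (in nats) of a finite distribution p."""
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):
        # alpha -> 1 recovers the Shannon entropy
        q = p[p > 0]
        return -np.sum(q * np.log(q))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def renyi_rate_markov(P, alpha):
    """Rényi entropy rate (nats/symbol) of an irreducible, aperiodic
    Markov chain with transition matrix P, computed from the spectral
    radius of the entrywise power P**alpha (finite-alphabet formula)."""
    lam = np.max(np.abs(np.linalg.eigvals(np.asarray(P, dtype=float) ** alpha)))
    return np.log(lam) / (1.0 - alpha)

# A symmetric two-state chain: its Shannon entropy rate equals the
# entropy of a single row, which the alpha -> 1 limit should approach,
# while alpha > 1 gives a value below the Shannon rate.
P = [[0.9, 0.1], [0.1, 0.9]]
shannon_rate = renyi_entropy([0.9, 0.1], 1.0)
near_one = renyi_rate_markov(P, 0.999)
order_two = renyi_rate_markov(P, 2.0)
```

For this symmetric chain, `near_one` is within about 0.005 nats of `shannon_rate`, and `order_two` falls strictly below it, consistent with the Rényi entropy being non-increasing in α.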
