Information Theory and its Applications
[1] Daniel B. Nelson. Conditional Heteroskedasticity in Asset Returns: A New Approach, 1991.
[2] S. M. Sunoj, et al. Dynamic cumulative residual Rényi's entropy, 2012.
[3] A. Barron, et al. Fisher information inequalities and the central limit theorem, 2001, math/0111020.
[4] R. Guy. Sets of Integers Whose Subsets Have Distinct Sums, 1982.
[5] Mokshay M. Madiman, et al. Generalized Entropy Power Inequalities and Monotonicity Properties of Information, 2006, IEEE Transactions on Information Theory.
[6] H. Nyquist. Certain factors affecting telegraph speed, 1924, Journal of the A.I.E.E.
[7] R. Hartley. Transmission of information, 1928.
[8] J. Linnik. An Information-Theoretic Proof of the Central Limit Theorem with Lindeberg Conditions, 1959.
[9] Limiting behavior of relative Rényi entropy in a non-regular location shift family, 2002, math/0212077.
[10] S. Nadarajah. A generalized normal distribution, 2005.
[11] Yiming Ding, et al. The convergence of the Rényi entropy of the normalized sums of IID random variables, 2010.
[12] Bovas Abraham, et al. Rényi's entropy for residual lifetime distribution, 2006.
[13] Murali Rao, et al. More on a New Concept of Entropy and Information, 2005.
[14] O. Johnson. Information Theory and the Central Limit Theorem, 2004.
[15] C. R. Rao, et al. On the convexity of some divergence measures based on entropy functions, 1982, IEEE Transactions on Information Theory.
[16] A. Barron. Entropy and the Central Limit Theorem, 1986.
[17] Yunmei Chen, et al. Cumulative residual entropy: a new measure of information, 2004, IEEE Transactions on Information Theory.
[18] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[19] Majid Asadi, et al. On the dynamic cumulative residual entropy, 2007.
[20] Assaf Naor, et al. On the rate of convergence in the entropic central limit theorem, 2004.
[21] A note on entropies of l-max stable, p-max stable, generalized Pareto and generalized log-Pareto distributions, 2012.
[22] Seiji Takano. Convergence of Entropy in the Central Limit Theorem, 1987.
[23] Ryoichi Shimizu. On Fisher’s Amount of Information for Location Family, 1975.
[24] S. Kullback. A lower bound for discrimination information in terms of variation (Corresp.), 1967, IEEE Transactions on Information Theory.
[25] Csaba Sándor. On a Problem of Erdős, 1997.
[26] M. Menéndez, et al. (h, Φ)-entropy differential metric, 1997.
[27] Amir Dembo, et al. Information theoretic inequalities, 1991, IEEE Transactions on Information Theory.
[28] Antonio Di Crescenzo, et al. On Weighted Residual and Past Entropies, 2006.
[29] Claude E. Shannon. A Mathematical Theory of Communication, 1948.
[30] Antonia Maria Tulino, et al. Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof, 2006, IEEE Transactions on Information Theory.
[31] C. Vignat, et al. Some results concerning maximum Rényi entropy distributions, 2005, math/0507400.
[32] Antonio Di Crescenzo, et al. On cumulative entropies, 2009.
[33] Solomon Kullback. Correction to A Lower Bound for Discrimination Information in Terms of Variation, 1970, IEEE Transactions on Information Theory.
[34] Nader Ebrahimi, et al. Ordering univariate distributions by entropy and variance, 1999.
[35] D. Berry, et al. Statistics: Theory and Methods, 1990.
[36] Isaac L. Chuang, et al. Quantum Computation and Quantum Information, 2010.
[37] K. Ball, et al. Solution of Shannon's problem on the monotonicity of entropy, 2004.
[38] Leandro Pardo, et al. Asymptotic distribution of (h, φ)-entropies, 1993.
[39] A. Rényi. On Measures of Entropy and Information, 1961.
[40] A. Wyner, et al. On communication of analog data from a bounded source space, 1969.