ON RÉNYI DIVERGENCE MEASURES FOR CONTINUOUS ALPHABET SOURCES
[1] E. Hellinger. Neue Begründung der Theorie quadratischer Formen von unendlichvielen Veränderlichen, 1909.
[2] A. D. Michal. Matrix and Tensor Calculus, 1947.
[3] C. E. Shannon. A mathematical theory of communication, 1948, Bell Syst. Tech. J.
[4] S. Kullback, R. A. Leibler. On Information and Sufficiency, 1951.
[5] H. Chernoff. A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations, 1952.
[6] U. Grenander, G. Szegő. Toeplitz Forms and Their Applications, 1958.
[7] E. Lehmann. Testing Statistical Hypotheses, 1960.
[8] A. Rényi. On Measures of Entropy and Information, 1961.
[9] M. Abramowitz, I. A. Stegun. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 1964.
[10] L. L. Campbell. A Coding Theorem and Rényi's Entropy, 1965, Inf. Control.
[11] M. Abramowitz, I. A. Stegun. Handbook of Mathematical Functions with Formulas, Graphs and Mathematical Tables (National Bureau of Standards Applied Mathematics Series No. 55), 1965.
[12] S. M. Ali, S. D. Silvey. A General Class of Coefficients of Divergence of One Distribution from Another, 1966.
[13] W. Rudin. Real and Complex Analysis, 1968.
[14] P. A. Devijver, J. Kittler. Pattern Recognition: A Statistical Approach, 1982.
[15] G. B. Folland. Real Analysis: Modern Techniques and Their Applications, 1984.
[16] J. Burbea. The convexity with respect to Gaussian distributions of divergences of order α, 1984.
[17] J. Magnus, H. Neudecker. Matrix Differential Calculus with Applications in Statistics and Econometrics (Revised Edition), 1999.
[18] M. Basseville. Distance measures for signal processing and pattern recognition, 1989.
[19] I. Vajda. Theory of Statistical Inference and Information, 1989.
[20] G. Casella, R. L. Berger. Statistical Inference, 2003.
[21] J. Lin. Divergence measures based on the Shannon entropy, 1991, IEEE Trans. Inf. Theory.
[22] S. Ihara. Information Theory for Continuous Systems, 1993.
[23] N. L. Johnson, S. Kotz, N. Balakrishnan. Continuous Univariate Distributions, 1995.
[24] I. Csiszár. Generalized cutoff rates and Rényi's information measures, 1995, IEEE Trans. Inf. Theory.
[25] J. H. van Schuppen, et al. System identification with information theoretic criteria, 1995.
[26] C. Cachin. Entropy measures and unconditional security in cryptography, 1997.
[27] K. F. Riley, M. P. Hobson, S. J. Bence. Mathematical Methods for Physics and Engineering, 1998.
[28] J. A. Pardo. Use of Rényi's divergence to test for the equality of the coefficients of variation, 2000.
[29] S. Roberts, et al. Bayesian methods for autoregressive models, 2000, Proc. IEEE Signal Processing Society Workshop on Neural Networks for Signal Processing X.
[30] I. Vajda, et al. Entropy expressions for multivariate continuous distributions, 2000, IEEE Trans. Inf. Theory.
[31] K. Song. Rényi information, loglikelihood and an intrinsic distribution measure, 2001.
[32] F. Alajaji, et al. Rényi's divergence and entropy rates for finite alphabet Markov sources, 2001, IEEE Trans. Inf. Theory.
[33] A. L. Gibbs, F. E. Su. On Choosing and Bounding Probability Metrics, 2002, math/0209021.
[34] K. Zografos, et al. Formulas for Rényi information and related measures for univariate distributions, 2003, Inf. Sci.
[35] F. Alajaji, et al. Csiszár's cutoff rates for the general hypothesis testing problem, 2004, IEEE Trans. Inf. Theory.
[36] P. Jizba, T. Arimitsu. The world according to Rényi: thermodynamics of multifractal systems, 2002, cond-mat/0207707.
[37] R. M. Gray. Toeplitz and Circulant Matrices: A Review, 2005, Found. Trends Commun. Inf. Theory.
[38] M. Asadi, et al. Information Measures for Pareto Distributions and Order Statistics, 2006.
[39] F. Liese, I. Vajda. On Divergences and Informations in Statistics and Information Theory, 2006, IEEE Trans. Inf. Theory.
[40] P. Harremoës. Interpretations of Rényi entropies and divergences, 2006.
[41] K. Berns, et al. Probabilistic distance measures of the Dirichlet and Beta distributions, 2008, Pattern Recognit.
[42] I. Csiszár. Axiomatic Characterizations of Information Measures, 2008, Entropy.
[43] K. Pearson. On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling, 2009.
[44] D. Morales, et al. Rényi statistics for testing equality of autocorrelation coefficients, 2009.
[45] G. Yari, et al. Some properties of Rényi entropy and Rényi entropy rate, 2009, Inf. Sci.
[46] E. Pasha, et al. Rényi entropy rate for Gaussian processes, 2010, Inf. Sci.
[47] T. van Erven, P. Harremoës. Rényi divergence and majorization, 2010, Proc. IEEE Int. Symp. Information Theory.
[48] J. Aczél, Z. Daróczy. On Measures of Information and Their Characterizations, 2012.
[49] S. H. Friedberg, A. J. Insel, L. E. Spence. Linear Algebra, 2018.
[50] F. Liese, I. Vajda. Convex Statistical Distances, 2018.