On divergences of finite measures and their applicability in statistics and information theory

Modifications of the classical φ-divergences D_φ(μ, ν) = ∫ q φ(p/q) dλ of finite measures μ, ν on a σ-finite measure space (𝒳, 𝒜, λ) with Radon–Nikodym densities p = dμ/dλ and q = dν/dλ are introduced by the formula 𝔇_φ(μ, ν) = ∫ q φ̃(p/q) dλ, using nonnegative convex functions φ̃. Basic properties of the modified φ-divergences are investigated, such as the range of values, symmetry, and a decomposition into local and global components. A general φ-divergence formula for right-censored observations illustrates the statistical applicability. The Pinsker inequality for finite measures and the generalized Ornstein distance of stationary random processes illustrate the applicability in information theory.
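As a concrete illustration of the classical formula D_φ(μ, ν) = ∫ q φ(p/q) dλ, the following sketch evaluates it for discrete measures with the counting measure as λ, taking φ(t) = t log t (which yields the Kullback–Leibler divergence) and checking the classical Pinsker inequality D ≥ 2·TV² mentioned in the abstract. The distributions p and q below are illustrative inputs, not taken from the paper.

```python
import numpy as np

def phi_divergence(p, q, phi):
    """Classical phi-divergence D_phi = sum_x q(x) * phi(p(x)/q(x)),
    the discrete case of the integral formula (q assumed positive)."""
    return float(np.sum(q * phi(p / q)))

def phi_kl(t):
    # phi(t) = t log t: convex with phi(1) = 0, gives KL divergence in nats
    return np.where(t > 0, t * np.log(t), 0.0)

# Illustrative probability vectors (any finite measures would do)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

kl = phi_divergence(p, q, phi_kl)
tv = 0.5 * np.abs(p - q).sum()   # total variation distance

# Classical Pinsker inequality: D_KL(p || q) >= 2 * TV(p, q)^2
assert kl >= 2 * tv**2
```

For the modified divergences 𝔇_φ of the paper one would substitute the transformed convex function φ̃ for φ in the same routine; the skeleton above only covers the classical case.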
