Divergence measures based on the Shannon entropy

A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the probability distributions involved to satisfy the condition of absolute continuity. More importantly, their close relationships with the variational distance and with the probability of misclassification error are established in terms of bounds. These bounds are crucial in many applications of divergence measures. The measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.
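The abstract does not reproduce the formulas, but the entropy-based measure most commonly associated with this work is the Jensen-Shannon divergence, JSD(P, Q) = H((P+Q)/2) - (H(P)+H(Q))/2, where H is the Shannon entropy. The following is a minimal Python sketch, assuming finite discrete distributions given as probability vectors; the function and variable names are illustrative, not taken from the paper. It also computes the variational (L1) distance, the quantity the paper relates to the new measures via bounds.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits; the convention 0 * log 0 = 0 is used."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = H((p + q) / 2) - (H(p) + H(q)) / 2.

    Remains finite even when p and q are not absolutely continuous
    with respect to each other (one assigns zero probability where
    the other does not), and lies between 0 and 1 bit.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

def variational_distance(p, q):
    """Variational (L1) distance V(p, q) = sum_i |p_i - q_i|."""
    return float(np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float))))

# Example: distributions with partially disjoint support, for which the
# Kullback-Leibler divergence is infinite but the Jensen-Shannon
# divergence stays finite and bounded.
p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(jensen_shannon_divergence(p, q))  # 0.5 bits
print(variational_distance(p, q))       # 1.0
```

The example illustrates the abstract's main point: with overlapping but unequal supports the new measure is finite and bounded, whereas the Kullback divergence diverges; the specific bounds relating JSD, the variational distance, and the misclassification error are given in the paper itself.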
