A NEW DIRECTED DIVERGENCE MEASURE AND ITS CHARACTERIZATION

A new information-theoretic divergence measure is introduced and characterized. The new measure is related to the Kullback directed divergence but does not require the probability distributions involved to satisfy the condition of absolute continuity. Moreover, lower and upper bounds for the new measure are established in terms of the variational distance. A symmetric form of the divergence can also be defined and expressed in terms of the Shannon entropy function. Other properties of the new divergences, namely nonnegativity, finiteness, semiboundedness, and boundedness, are also discussed.
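To make the stated properties concrete, the following is a minimal numerical sketch in Python. It assumes, for illustration only, that the directed divergence takes the form K(P, Q) = Σ_i p_i log(2 p_i / (p_i + q_i)), that its symmetric form is L(P, Q) = K(P, Q) + K(Q, P), and that the latter can be written via the Shannon entropy as 2H((P+Q)/2) - H(P) - H(Q); these formulas and the function names are assumptions supplied for the sketch, not quotations from the paper. The example also computes the variational distance V(P, Q) = Σ_i |p_i - q_i|, against which the abstract's bounds are stated.

```python
import numpy as np


def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log p_i (natural log), with 0 log 0 := 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))


def k_divergence(p, q):
    """Assumed form of the new directed divergence (a sketch, not the paper's text):
    K(P, Q) = sum_i p_i log(2 p_i / (p_i + q_i)).
    The denominator p_i + q_i is positive wherever p_i > 0, so Q need not be
    absolutely continuous with respect to P for the sum to stay finite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(2.0 * p[nz] / (p[nz] + q[nz])))


def variational_distance(p, q):
    """V(P, Q) = sum_i |p_i - q_i|."""
    return np.sum(np.abs(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))


if __name__ == "__main__":
    # Q assigns zero probability to an outcome that P supports, so the Kullback
    # directed divergence D(P||Q) would be infinite, yet K(P, Q) remains finite.
    P = np.array([0.5, 0.3, 0.2])
    Q = np.array([0.6, 0.4, 0.0])

    K_pq = k_divergence(P, Q)
    K_qp = k_divergence(Q, P)
    L = K_pq + K_qp                       # symmetric form of the divergence
    M = 0.5 * (P + Q)
    L_via_entropy = 2 * shannon_entropy(M) - shannon_entropy(P) - shannon_entropy(Q)

    print("K(P,Q) =", K_pq)               # finite despite Q[2] == 0
    print("L(P,Q) =", L)
    print("2H((P+Q)/2) - H(P) - H(Q) =", L_via_entropy)  # agrees with L numerically
    print("V(P,Q) =", variational_distance(P, Q))
```

Under the assumed formula, the identity L(P, Q) = 2H((P+Q)/2) - H(P) - H(Q) follows by expanding the logarithms, so the two printed values of the symmetric divergence coincide up to floating-point error, while the finite value of K(P, Q) illustrates why absolute continuity is not needed.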
