Cross entropy, dissimilarity measures, and characterizations of quadratic entropy

A unified approach is given for constructing cross entropy and dissimilarity measures between probability distributions, based on a given entropy function or diversity measure. Special properties of the quadratic entropy introduced by Rao [7] are described. In particular, it is shown that the square root of the Jensen difference (dissimilarity measure) arising from a quadratic entropy provides a metric on a probability space. Several characterizations of quadratic entropy are obtained.
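For orientation, the following is a sketch of the objects the abstract refers to, assuming the standard definitions from Rao's earlier work on quadratic entropy; the symbols $H$, $J$, and $d_{ij}$ are notational assumptions, not notation taken from this text.

\[
  H(p) \;=\; \sum_{i,j} d_{ij}\, p_i p_j ,
  \qquad d_{ij} = d_{ji} \ge 0, \quad d_{ii} = 0,
\]
\[
  J(p,q) \;=\; H\!\left(\tfrac{p+q}{2}\right) \;-\; \tfrac{1}{2}\bigl(H(p) + H(q)\bigr).
\]

Here $H$ is a quadratic entropy built from pairwise dissimilarities $d_{ij}$ between categories, and $J$ is the Jensen difference it induces; the abstract's metric claim concerns $\sqrt{J(p,q)}$ on the probability simplex, under the conditions on $(d_{ij})$ developed in the paper.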