On the KL Divergence of Probability Mixtures for Belief Contraction

Probabilistic belief change is an operation that takes a probability distribution representing a belief state, together with an input sentence representing information to be accommodated or removed, and maps them to a new probability distribution. In order to choose from the many possible such mappings, techniques from information theory, such as the principle of minimum cross-entropy, have previously been used. Central to this principle is the Kullback-Leibler (KL) divergence. In this short study, we focus on the contraction of a belief state \(P\) by a belief \(a\), that is, the process of turning the belief \(a\) into a non-belief. The contracted belief state \(P^-_a\) can be represented as a mixture of two states: the original belief state \(P\) and the state \(P^*_{\lnot a}\) that results from revising \(P\) by \(\lnot a\). Crucial to this mixture is the mixing factor \(\epsilon\), which determines the proportions of \(P\) and \(P^*_{\lnot a}\) used in the mixture. We show that once \(\epsilon\) is determined, the KL divergence of \(P^-_a\) from \(P\) is given by a function whose only argument is \(\epsilon\). We suggest that \(\epsilon\) is not only a mixing factor but also captures the relevant aspects of \(P\) and \(P^*_{\lnot a}\) required for computing the KL divergence.
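
For concreteness, the quantities involved can be written out as follows. This is only a sketch: the particular conventions adopted here (taking the revision \(P^*_{\lnot a}\) to be Bayesian conditioning, as minimum cross-entropy prescribes when \(P(\lnot a) > 0\); restricting \(\epsilon\) to \([0,1)\); and reading 'the KL divergence of \(P^-_a\) from \(P\)' as \(D_{\mathrm{KL}}(P^-_a \,\|\, P)\)) are assumptions rather than details fixed by the text above. With \(\omega\) ranging over possible worlds,
\[
P^-_a(\omega) \;=\; \epsilon\, P(\omega) \;+\; (1-\epsilon)\, P^*_{\lnot a}(\omega),
\qquad 0 \le \epsilon < 1,
\qquad P^*_{\lnot a}(\omega) \;=\; P(\omega \mid \lnot a),
\]
and the divergence in question is
\[
D_{\mathrm{KL}}\!\left(P^-_a \,\middle\|\, P\right)
\;=\; \sum_{\omega} P^-_a(\omega)\, \log \frac{P^-_a(\omega)}{P(\omega)}.
\]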