Probabilistic belief change is an operation that takes a probability distribution representing a belief state, together with an input sentence representing information to be accommodated or removed, and maps it to a new probability distribution. To choose among the many possible such mappings, techniques from information theory, such as the principle of minimum cross-entropy, have previously been used. Central to this principle is the Kullback-Leibler (KL) divergence. In this short study, we focus on the contraction of a belief state P by a belief a, that is, the process of turning the belief a into a non-belief. The contracted belief state P−a can be represented as a mixture of two states: the original belief state P, and the state P*¬a that results from revising P by ¬a. Crucial to this mixture is the mixing factor ε, which determines the proportions of P and P*¬a used in this process. We show that once ε is determined, the KL divergence of P−a from P is given by a function whose only argument is ε. We suggest that ε is not only a mixing factor but also captures the aspects of P and P*¬a relevant to computing the KL divergence.
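As a minimal numerical sketch of the claim, assume the mixture convention P−a = (1−ε)·P + ε·P*¬a, and measure the divergence as D(P ‖ P−a). The distributions, world labels, and the value of ε below are illustrative choices, not taken from the paper. Since P believes a (all mass on a-worlds) and P*¬a believes ¬a (all mass on ¬a-worlds), the two components have disjoint supports, and the divergence reduces to −log(1−ε), a function of ε alone:

```python
import math

def kl(p, q):
    """KL divergence D(p || q), skipping terms where p is zero."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical four-world example: worlds 0-1 satisfy a; worlds 2-3 satisfy ¬a.
P     = [0.7, 0.3, 0.0, 0.0]   # believes a: all mass on a-worlds
P_rev = [0.0, 0.0, 0.4, 0.6]   # P revised by ¬a: all mass on ¬a-worlds

eps = 0.2  # illustrative mixing factor
# Contraction as a mixture (assumed convention): P−a = (1−ε)·P + ε·P*¬a
P_contract = [(1 - eps) * p + eps * q for p, q in zip(P, P_rev)]

# Because P and P*¬a have disjoint supports, every nonzero entry of P is
# scaled by exactly (1−ε) in the mixture, so D(P || P−a) = −log(1−ε).
print(kl(P, P_contract))    # depends on eps only
print(-math.log(1 - eps))   # same value
```

Changing the entries of P or P_rev (while keeping their supports disjoint) leaves the computed divergence unchanged, which is the ε-only dependence described above.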