On the KL divergence of probability mixtures for belief contraction

Kinzang Chhogyal*, Abhaya Nayak, Abdul Sattar

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding contribution

1 Citation (Scopus)

Abstract

Probabilistic belief change is an operation that takes a probability distribution representing a belief state, along with an input sentence representing some information to be accommodated or removed, and maps it to a new probability distribution. In order to choose from the many such possible mappings, techniques from information theory, such as the principle of minimum cross-entropy, have previously been used. Central to this principle is the Kullback-Leibler (KL) divergence. In this short study, we focus on the contraction of a belief state P by a belief a, which is the process of turning the belief a into a non-belief. The contracted belief state Pa can be represented as a mixture of two states: the original belief state P, and the resultant state P*¬a of revising P by ¬a. Crucial to this mixture is the mixing factor ∊, which determines the proportions of P and P*¬a that are to be used in this process. We show that once ∊ is determined, the KL divergence of Pa from P is given by a function whose only argument is ∊. We suggest that ∊ is not only a mixing factor but also captures the relevant aspects of P and P*¬a required for computing the KL divergence.
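The mixture construction described in the abstract can be illustrated numerically. The sketch below is an assumption-laden toy example (the world partition, the two distributions, and the divergence direction D(P ‖ Pa) are all illustrative choices, not taken from the paper): since P believes a, its mass sits on a-worlds, while P*¬a puts its mass on ¬a-worlds, so on P's support the mixture scales P by exactly ∊ and the divergence collapses to a function of ∊ alone.

```python
import numpy as np

def kl(p, q):
    """KL divergence D(p || q), summing only over the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical model with 4 worlds: worlds 0-1 satisfy a, worlds 2-3 satisfy ¬a.
P = np.array([0.7, 0.3, 0.0, 0.0])      # believes a: all mass on a-worlds
P_rev = np.array([0.0, 0.0, 0.6, 0.4])  # P revised by ¬a: all mass on ¬a-worlds

eps = 0.8
# Contraction as a mixture: a is no longer fully believed in P_contracted.
P_contracted = eps * P + (1 - eps) * P_rev

# On P's support, P_contracted = eps * P, so D(P || P_contracted) = -log(eps),
# independent of the particular P and P_rev chosen above.
print(kl(P, P_contracted), -np.log(eps))
```

Varying P and P_rev (while keeping their supports disjoint) leaves the printed divergence unchanged, which is the behaviour the abstract's result describes.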

Original language: English
Title of host publication: KI 2015: Advances in Artificial Intelligence
Subtitle of host publication: 38th Annual German Conference on AI, Dresden, Germany, September 21-25, 2015, Proceedings
Editors: Steffen Hölldobler, Markus Krötzsch, Rafael Peñaloza, Sebastian Rudolph
Place of Publication: Cham
Publisher: Springer, Springer Nature
Pages: 249-255
Number of pages: 7
ISBN (Electronic): 9783319244891
ISBN (Print): 9783319244884
DOIs: 10.1007/978-3-319-24489-1_20
Publication status: Published - 2015
Event: 38th Annual German Conference on Advances in Artificial Intelligence, AI 2015 - Dresden, Germany
Duration: 21 Sep 2015 - 25 Sep 2015

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer International Publishing
Volume: 9324
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 38th Annual German Conference on Advances in Artificial Intelligence, AI 2015
Country: Germany
City: Dresden
Period: 21/09/15 - 25/09/15


Cite this

Chhogyal, K., Nayak, A., & Sattar, A. (2015). On the KL divergence of probability mixtures for belief contraction. In S. Hölldobler, M. Krötzsch, R. Peñaloza, & S. Rudolph (Eds.), KI 2015: Advances in Artificial Intelligence: 38th Annual German Conference on AI, Dresden, Germany, September 21-25, 2015, Proceedings (pp. 249-255). [A20] (Lecture Notes in Computer Science; Vol. 9324). Cham: Springer, Springer Nature. https://doi.org/10.1007/978-3-319-24489-1_20