eScholarship
Open Access Publications from the University of California

UC Riverside Previously Published Works

A Bayesian Characterization of Relative Entropy

Abstract

We give a new characterization of relative entropy, also known as the Kullback-Leibler divergence. We use a number of interesting categories related to probability theory. In particular, we consider a category FinStat where an object is a finite set equipped with a probability distribution, while a morphism is a measure-preserving function f: X → Y together with a stochastic right inverse s: Y → X. The function f can be thought of as a measurement process, while s provides a hypothesis about the state of the measured system given the result of a measurement. Given this data we can define the entropy of the probability distribution on X relative to the 'prior' given by pushing the probability distribution on Y forward along s. We say that s is 'optimal' if these distributions agree. We show that any convex linear, lower semicontinuous functor from FinStat to the additive monoid [0, ∞] which vanishes when s is optimal must be a scalar multiple of this relative entropy. Our proof is independent of all earlier characterizations, but inspired by the work of Petz. © John C. Baez and Tobias Fritz, 2014.
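The quantity being characterized is the usual Kullback-Leibler divergence. Below is a minimal numerical sketch of the setup described in the abstract, assuming the standard encoding of f as a 0/1 matrix and of s as a column-stochastic matrix; the variable names and toy distributions are illustrative, not taken from the paper.

```python
import numpy as np

# A toy instance of a FinStat morphism (f, s): (X, p) -> (Y, q),
# with X = {0, 1, 2} and Y = {0, 1}. Names and numbers are illustrative.

p = np.array([0.5, 0.3, 0.2])        # probability distribution on X

# f: X -> Y as a 0/1 matrix, F[y, x] = 1 iff f(x) = y.
F = np.array([[1, 1, 0],
              [0, 0, 1]])
q = F @ p                            # pushforward: the distribution on Y

# s: Y -> X, a stochastic right inverse of f: each column is a
# distribution supported on the fibre over that y, so F @ S = I.
S = np.array([[0.6, 0.0],
              [0.4, 0.0],
              [0.0, 1.0]])
assert np.allclose(F @ S, np.eye(2))

r = S @ q                            # the 'prior' on X: q pushed forward along s

# Relative entropy of p with respect to r; it vanishes exactly
# when s is optimal, i.e. when r = p.
rel_entropy = np.sum(p * np.log(p / r))   # assumes r > 0 wherever p > 0
print(q, r, rel_entropy)
```

With these numbers the hypothesis s slightly misallocates mass within the fibre {0, 1}, so the relative entropy is small but positive; replacing the first column of S with [0.625, 0.375] makes r equal p and the value drop to zero, illustrating the 'optimal' case.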

