
A characterization of entropy in terms of information loss

  • Author(s): Baez, JC; Fritz, T; Leinster, T

Published Web Location

https://doi.org/10.3390/e13111945
Abstract

There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well. © 2011 by the authors.
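The notion of information loss in the abstract can be made concrete: for a measure-preserving map f between finite probability spaces, the loss is H(p) - H(f_*p), where f_*p is the pushforward measure on the codomain; equivalently, it is the entropy of the input random variable conditioned on its image under f. The following Python sketch (not from the paper; the function names and the example map are illustrative assumptions) computes this quantity for Shannon entropy.

    import math

    def shannon_entropy(p):
        """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    def pushforward(p, f, codomain_size):
        """Pushforward measure q on the codomain: q_j = sum of p_i with f(i) = j."""
        q = [0.0] * codomain_size
        for i, pi in enumerate(p):
            q[f(i)] += pi
        return q

    def information_loss(p, f, codomain_size):
        """Information loss of a measure-preserving map: H(p) - H(f_* p),
        i.e. the entropy of the input conditioned on the output."""
        return shannon_entropy(p) - shannon_entropy(pushforward(p, f, codomain_size))

    # Illustrative example (assumed data): merge a uniform distribution on 4 points
    # down to 2 points; the loss is log 2, the information destroyed by the merge.
    p = [0.25, 0.25, 0.25, 0.25]
    f = lambda i: i // 2            # sends {0,1} -> 0 and {2,3} -> 1
    print(information_loss(p, f, 2))  # ~0.6931 = log 2

Replacing Shannon entropy with Tsallis entropy of a given order in the sketch above yields the corresponding Tsallis information loss discussed in the abstract.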

