Information Theory Meets Expected Utility: The Entropic Roots of Probability Weighting Functions
eScholarship
Open Access Publications from the University of California


Abstract

This paper proposes that the shape and parameter fits of existing probability weighting functions can be explained by sensitivity to uncertainty (as measured by information entropy) and by the utility carried by reductions in uncertainty. Building on applications of information-theoretic principles to models of perceptual and inferential processes, we suggest that probabilities are evaluated relative to a plausible expectation (the uniform distribution) and that the perceived distance between a probability and uniformity is influenced by the shape (relative entropy) of the distribution in which the probability is embedded. These intuitions are formalized in a novel probability weighting function, VWD(p), which is simpler and has fewer parameters than existing probability weighting functions. The proposed probability weighting function captures characteristic features of existing probability weighting functions, introduces novel predictions, and provides a parsimonious account of findings in probability- and frequency-estimation tasks.
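The two information-theoretic quantities the abstract relies on, the entropy of a distribution and its relative entropy (Kullback-Leibler divergence) from the uniform distribution, can be sketched as follows. This is a generic illustration of those standard measures, not the paper's VWD(p) function, whose exact form is not reproduced here.

```python
import math

def shannon_entropy(dist):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def kl_from_uniform(dist):
    """Relative entropy D(p || u) of p from the uniform distribution
    over the same support; equals log2(n) - H(p), in bits."""
    n = len(dist)
    return sum(p * math.log2(p * n) for p in dist if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

# The uniform distribution has maximal entropy (log2 n = 2 bits here)
# and zero divergence from itself; a skewed distribution has lower
# entropy and a strictly positive divergence from uniformity.
print(shannon_entropy(uniform), kl_from_uniform(uniform))
print(shannon_entropy(skewed), kl_from_uniform(skewed))
```

On this view, the divergence from uniformity quantifies how far a distribution departs from the "plausible expectation" against which individual probabilities are evaluated.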
