
Information Theory Meets Expected Utility: The Entropic Roots of Probability Weighting Functions

Abstract

This paper proposes that the shape and parameter fits of existing probability weighting functions can be explained by sensitivity to uncertainty (as measured by information entropy) and the utility carried by reductions in uncertainty. Building on applications of information-theoretic principles to models of perceptual and inferential processes, I suggest that probabilities are evaluated relative to the distribution of maximum entropy (the uniform distribution) and that the perceived distance between a probability and uniformity is influenced by the shape (relative entropy) of the distribution that the probability is embedded in. These intuitions are formalized in a novel probability weighting function, VWD(p), which is simpler and has fewer free parameters than existing probability weighting functions. VWD(p) captures characteristic features of existing probability weighting functions, introduces novel predictions, and provides a parsimonious account of findings in probability- and frequency-estimation tasks.
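The abstract does not state the functional form of VWD(p), but the two quantities it builds on, the entropy of a distribution and its relative entropy (KL divergence) from the uniform maximum-entropy reference, are standard. The following minimal Python sketch, with illustrative function names of my own choosing, shows how those two quantities are computed for a lottery; it is not the paper's weighting function itself.

```python
import numpy as np

def shannon_entropy(dist):
    """Shannon entropy H(P) = -sum_i p_i * log(p_i), in nats."""
    dist = np.asarray(dist, dtype=float)
    nz = dist[dist > 0]  # convention: 0 * log(0) = 0
    return -np.sum(nz * np.log(nz))

def relative_entropy_from_uniform(dist):
    """KL divergence D(P || U) from the uniform distribution on the same support.

    Equals log(n) - H(P): zero exactly when P is uniform (maximum entropy),
    and growing as P becomes more peaked.
    """
    dist = np.asarray(dist, dtype=float)
    return np.log(len(dist)) - shannon_entropy(dist)

# Example: a skewed three-outcome lottery compared against the uniform reference.
skewed = [0.7, 0.2, 0.1]
print(shannon_entropy(skewed))                # below log(3), the maximum entropy
print(relative_entropy_from_uniform(skewed))  # > 0: the lottery's distance from uniformity
```

On the abstract's account, a given probability p would be weighted not in isolation but with reference to quantities like these for the distribution in which p is embedded.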
