This paper proposes that the shape and parameter fits of
existing probability weighting functions can be explained with
sensitivity to uncertainty (as measured by information entropy)
and the utility carried by reductions in uncertainty. Building on
applications of information theoretic principles to models of
perceptual and inferential processes, we suggest that
probabilities are evaluated relative to a plausible expectation
(the uniform distribution) and that the perceived distance
between a probability and uniformity is influenced by the shape
(relative entropy) of the distribution that the probability is
embedded in. These intuitions are formalized in a novel
probability weighting function, VWD(p), which is simpler and
has fewer parameters than existing probability weighting
functions. The proposed probability weighting function
captures characteristic features of existing probability
weighting functions, introduces novel predictions, and
provides a parsimonious account of findings in probability-
and frequency-estimation tasks.