I can tell you know a lot, although I'm not sure what: Modeling broad epistemic inference from minimal action
Inferences about other people's knowledge and beliefs are central to social interaction. In many situations, however, it is not possible to be sure what other people know, because their behavior is consistent with a range of potential epistemic states. Nonetheless, this behavior can give us coarse intuitions about how much someone might know, even if we cannot pinpoint the exact nature of that knowledge. We present a computational model of this kind of broad epistemic-state inference, centered on the expectation that agents maximize epistemic utilities. We evaluate our model in a graded inference task in which people had to infer how much an agent knew based on the actions the agent chose. Critically, the agent's behavior was always under-determined, but it nonetheless contained information about how much knowledge the agent possessed. Our model captures nuanced patterns in participant judgments, revealing a quantitative capacity to infer amorphous knowledge from minimal behavioral evidence.
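The style of inference the abstract describes can be illustrated with a toy Bayesian inverse-planning sketch. This is not the paper's actual model or task; the knowledge levels, the two actions, the utility functions, and the softmax temperature below are all hypothetical choices made for illustration. An observer sees a single under-determined action and inverts an assumed utility-maximizing policy to obtain a graded posterior over how much the agent knows.

```python
import math

# Hypothetical setup (illustrative only): an agent chooses between
#   "search" - gathers information, more valuable the less the agent knows
#   "act"    - exploits existing knowledge, more valuable the more it knows
# The observer inverts a softmax policy over these utilities to infer a
# graded posterior over coarse knowledge levels.

knowledge_levels = [0.0, 0.25, 0.5, 0.75, 1.0]  # "how much the agent knows"
prior = {k: 1.0 / len(knowledge_levels) for k in knowledge_levels}

def policy(action, k, beta=4.0):
    """P(action | knowledge k) under a softmax over assumed utilities."""
    u = {"search": 1.0 - k, "act": k}  # hypothetical epistemic utilities
    z = sum(math.exp(beta * v) for v in u.values())
    return math.exp(beta * u[action]) / z

def posterior(action):
    """Bayes' rule: P(k | action) is proportional to P(action | k) * P(k)."""
    unnorm = {k: prior[k] * policy(action, k) for k in knowledge_levels}
    z = sum(unnorm.values())
    return {k: p / z for k, p in unnorm.items()}

# A single observed "search" action shifts belief toward low knowledge
# without pinning down the agent's exact epistemic state.
post = posterior("search")
```

Even this minimal sketch shows the key qualitative property from the abstract: one under-determined action cannot identify the agent's exact epistemic state, but it reshapes the posterior over how much the agent knows in a graded way.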