
Analogy as Nonparametric Bayesian Inference over Relational Systems

Abstract

Much of human learning and inference can be framed within the computational problem of relational generalization. In this project, we propose a Bayesian model that generalizes relational knowledge to novel environments by analogically weighting predictions from previously encountered relational structures. First, we show that this learner outperforms a naive, theory-based learner on relational data derived from random- and Wikipedia-based systems when experience with the environment is small. Next, we show how our formalization of analogical similarity translates to the selection and weighting of analogies. Finally, we combine the analogy- and theory-based learners in a single nonparametric Bayesian model, and show that optimal relational generalization transitions from relying on analogies to building a theory of the novel system with increasing experience in it. Beyond predicting unobserved interactions better than either baseline, this formalization gives a computational-level perspective on the formation and abstraction of analogies themselves.
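To make the abstract's central idea concrete, the sketch below illustrates one way analogy-weighted prediction of this general kind could look: predictions for an unobserved relation in a novel system are mixed over previously encountered relational systems, each weighted by how well it explains the observations so far. This is a minimal illustrative sketch, not the paper's model; every function and variable name (analogy_weighted_prediction, similarity, the toy systems) is an assumption introduced here for illustration.

```python
import numpy as np

def analogy_weighted_prediction(novel_obs, prior_systems, similarity):
    """Hypothetical sketch: predict an unobserved relation in a novel system
    by mixing predictions from previously encountered relational systems,
    each weighted by its analogical similarity to the observations so far.

    novel_obs     : dict mapping observed (i, j) pairs to 0/1 relation outcomes
    prior_systems : list of predictors; each maps an (i, j) pair to P(relation)
    similarity    : function scoring how well a prior system explains novel_obs
    """
    # Analogical weights: normalized similarity of each stored system
    # to the (possibly sparse) observations from the novel environment.
    scores = np.array([similarity(system, novel_obs) for system in prior_systems])
    weights = scores / scores.sum()

    def predict(pair):
        # Mixture over prior relational systems, in the spirit of a
        # posterior-predictive average.
        return float(sum(w * system(pair) for w, system in zip(weights, prior_systems)))

    return predict


if __name__ == "__main__":
    # Toy usage with made-up systems: each "system" is just a probability rule.
    systems = [
        lambda pair: 0.9 if pair[0] < pair[1] else 0.1,  # a hierarchy-like structure
        lambda pair: 0.5,                                # an uninformative structure
    ]
    observed = {(0, 1): 1, (1, 2): 1}
    # Similarity here is simply the likelihood of the observations under a system.
    sim = lambda system, obs: np.prod(
        [system(p) if y else 1 - system(p) for p, y in obs.items()]
    )
    predict = analogy_weighted_prediction(observed, systems, sim)
    print(predict((0, 2)))  # probability that the unobserved relation holds
```

With little experience (few observed pairs), the mixture leans on whichever stored system is most analogous; as observations accumulate, a theory built directly from the novel system's data would dominate, which is the transition the abstract describes.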
