eScholarship
Open Access Publications from the University of California

Grounding Compositional Hypothesis Generation in Specific Instances

Abstract

A number of recent computational models treat concept learning as a form of probabilistic rule induction in a space of language-like, compositional concepts. Inference in such models frequently requires repeatedly sampling from an (infinite) distribution over possible concept rules and comparing their relative likelihood in light of current data or evidence. However, we argue that most existing algorithms for top-down sampling are inefficient and cognitively implausible accounts of human hypothesis generation. As a result, we propose an alternative, Instance Driven Generator (IDG), that constructs bottom-up hypotheses directly out of encountered positive instances of a concept. Using a novel rule induction task based on the children's game Zendo, we compare these "bottom-up" and "top-down" approaches to inference. We find that the bottom-up IDG model accounts better for human inferences and results in a computationally more tractable inference mechanism for concept learning models based on a probabilistic language of thought.
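To illustrate the contrast the abstract draws, the sketch below implements a toy version of the two proposal strategies. The feature grammar, feature names, and scene representation here are illustrative assumptions, not the paper's actual Zendo rule language or likelihood model: top-down proposals are sampled from the grammar without consulting the data, while a bottom-up, instance-driven generator only proposes rules that are already true of an observed positive example.

```python
import random

random.seed(0)

# Hypothetical toy concept language: a rule is a (feature, value) pair
# meaning "some block in the scene has this feature value".
# A scene is a list of blocks, each a dict of features.
# These feature names are assumptions for illustration only.
FEATURES = {"colour": ["red", "green", "blue"], "size": [1, 2, 3]}

def sample_rule_top_down():
    """Top-down: sample a rule from the grammar, ignoring the data."""
    feat = random.choice(list(FEATURES))
    val = random.choice(FEATURES[feat])
    return (feat, val)

def rules_from_instance(scene):
    """Bottom-up (IDG-style): propose only rules that hold of an
    observed positive instance of the concept."""
    return {(feat, block[feat]) for block in scene for feat in FEATURES}

def satisfies(scene, rule):
    """Check whether any block in the scene matches the rule."""
    feat, val = rule
    return any(block[feat] == val for block in scene)

# One observed positive example of the (unknown) target concept.
positive_scene = [{"colour": "red", "size": 2}]

# Top-down proposals may or may not be consistent with the example...
top_down_proposals = [sample_rule_top_down() for _ in range(5)]

# ...while bottom-up proposals are consistent with it by construction.
grounded_proposals = rules_from_instance(positive_scene)
assert all(satisfies(positive_scene, r) for r in grounded_proposals)
```

In a richer rule language, the gap grows: a top-down sampler wastes most of its samples on rules that are falsified by the data already in hand, whereas the instance-driven generator starts from hypotheses guaranteed to cover the positive examples.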
