
Powering up causal generalization: A model of human conceptual bootstrapping with adaptor grammars

Abstract

Human learning and generalization benefit from bootstrapping: we arrive at complex concepts by starting small and building upon past successes. In this paper, we examine a computational account of causal conceptual bootstrapping, and describe a novel experiment in which the sequence of training data produces a dramatic order effect: participants succeed in identifying a compound concept only after experiencing training data in a “helpful” order. Our computational model represents causal relations as reusable, modular programs, which can themselves be “chunked” and flexibly reused to tackle more complex tasks. Our specific approach is grounded in combinatory logic and adaptor grammars, building on previous theories that posit a “language of thought” for concept representation, but making the learning process more sensitive to a learner’s experiences than to any particular choice of conceptual primitives. Crucially, we demonstrate that a caching mechanism like that used in adaptor grammars is key to explaining human-like bootstrapping patterns in causal generalization.
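To make the abstract's notion of "chunking" concrete, the following is a minimal, hypothetical Python sketch of an adaptor-grammar-style cache: program fragments that proved useful earlier are stored and preferentially reused when generating hypotheses for later tasks. The names (AdaptorCache, build_fresh, PRIMITIVES) and the specific sampling scheme are illustrative assumptions, not the authors' implementation.

```python
import random
from collections import Counter

# Hypothetical primitive operations standing in for a causal "language of thought".
PRIMITIVES = ["stripe", "dot", "spike"]


class AdaptorCache:
    """Chinese-restaurant-style cache over previously built program fragments.

    Fragments that were useful on earlier tasks accumulate counts, so they are
    increasingly likely to be reused wholesale ("chunked") on later tasks.
    """

    def __init__(self, alpha=1.0):
        self.alpha = alpha        # pseudo-count for building a fresh fragment
        self.counts = Counter()   # reuse counts for each cached fragment

    def sample_fragment(self, build_fresh):
        """Either reuse a cached fragment or compose a new one from primitives."""
        total = sum(self.counts.values()) + self.alpha
        r = random.uniform(0, total)
        acc = 0.0
        for fragment, count in self.counts.items():
            acc += count
            if r < acc:
                self.counts[fragment] += 1   # reuse: reinforce this chunk
                return fragment
        fragment = build_fresh()             # otherwise build a new program
        self.counts[fragment] += 1
        return fragment


def build_fresh():
    """Compose a small program from primitives (placeholder for real search)."""
    parts = random.sample(PRIMITIVES, k=random.choice([1, 2]))
    return "compose(" + ", ".join(parts) + ")"


if __name__ == "__main__":
    cache = AdaptorCache(alpha=1.0)
    # Early, simple tasks seed the cache; later samples tend to reuse those
    # chunks, which is the kind of order sensitivity the abstract alludes to.
    for _ in range(10):
        print(cache.sample_fragment(build_fresh))
```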
