eScholarship
Open Access Publications from the University of California

A Continuum of Induction Methods for Learning Probability Distributions with Generalization

Abstract

Probabilistic models of pattern completion have several advantages, namely, the ability to handle arbitrary conceptual representations, including compositional structures, and the explicitness of their distributional assumptions. However, a gap in the theory of induction of priors has hindered probabilistic modeling of cognitive generalization biases. We propose a family of methods parameterized along a value γ that controls the degree to which the probability distribution being induced generalizes from the training set. The extremes of the γ-continuum correspond to relative frequency methods and extreme maximum entropy methods. The methods apply to a wide range of pattern representations, including simple feature vectors as well as frame-like feature DAGs.
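The abstract's γ-continuum can be illustrated with a minimal sketch. The specific interpolation below (a linear mixture of the empirical relative-frequency distribution and the uniform, maximum-entropy distribution over a known support) is an assumption for illustration only; the paper's actual family of methods is not specified in the abstract, and the function and parameter names here are hypothetical.

```python
from collections import Counter

def induce_distribution(samples, support, gamma):
    """Hypothetical sketch of a gamma-parameterized induction method.

    gamma = 0.0 recovers relative frequencies (no generalization);
    gamma = 1.0 recovers the uniform maximum-entropy distribution.
    Intermediate values blend the two linearly (an assumed scheme,
    not the paper's actual construction).
    """
    counts = Counter(samples)
    n = len(samples)
    uniform = 1.0 / len(support)
    return {
        x: (1.0 - gamma) * counts.get(x, 0) / n + gamma * uniform
        for x in support
    }

# Training set mentions "a" twice and "b" once; "c" is unseen.
dist = induce_distribution(["a", "a", "b"], ["a", "b", "c"], gamma=0.5)
```

With gamma = 0 the unseen pattern "c" gets zero probability; any gamma > 0 generalizes by shifting mass toward unseen patterns, up to the uniform distribution at gamma = 1.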
