
Compression: A Lossless Mechanism for Learning Complex Structured Relational Representations

Abstract

People learn by both decomposing and combining concepts; most accounts of combination are either compositional or conjunctive. We augment the DORA model of representation learning to build new predicate representations by combining (or compressing) existing predicate representations (e.g., building a predicate a_b by combining predicates a and b). The resulting model learns structured relational representations from experience and then combines these relational concepts to form more complex, compressed concepts. We show that the resulting model provides an account of a category learning experiment in which categories are defined by novel combinations of relational concepts.
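
The abstract does not specify how compression is implemented in DORA. As a rough illustration only, the sketch below shows one way a compressed predicate a_b might be formed from two existing single-place predicates; the names, the feature-dictionary representation, and the averaging rule for overlapping features are all assumptions for this example, not DORA's actual algorithm.

```python
# Minimal sketch of predicate "compression": combining two learned
# single-place predicates into one compressed predicate (a_b).
# The feature-combination rule here is an illustrative assumption,
# not the DORA implementation.

from dataclasses import dataclass


@dataclass
class Predicate:
    name: str
    features: dict[str, float]  # distributed semantic features


def compress(a: Predicate, b: Predicate) -> Predicate:
    """Build a new predicate a_b whose features are the union of the
    parents' features, averaging weights where they overlap."""
    combined = dict(a.features)
    for feat, weight in b.features.items():
        if feat in combined:
            combined[feat] = (combined[feat] + weight) / 2.0
        else:
            combined[feat] = weight
    return Predicate(name=f"{a.name}_{b.name}", features=combined)


if __name__ == "__main__":
    # Two single-place predicates, each a set of weighted semantic features.
    taller = Predicate("taller", {"height": 1.0, "more": 0.9})
    wider = Predicate("wider", {"width": 1.0, "more": 0.9})

    taller_wider = compress(taller, wider)
    print(taller_wider.name)      # taller_wider
    print(taller_wider.features)  # merged feature set
```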
