Head-Driven Massively-Parallel Constraint Propagation

Abstract

We describe a model of natural language understanding based on Head-driven Massively-parallel Constraint Propagation (HMCP). This model contains a massively parallel memory network in which syntactic head features are propagated along with other information about the nodes that triggered the propagation. The propagated head features eventually collide with subcategorization lists, which contain constraints on subcategorized arguments. These mechanisms handle linguistic phenomena such as case, agreement, complement order, and control, which are fundamental to linguistic analysis but have not been captured in previous marker-passing models.

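To make the core idea concrete, here is a minimal Python sketch of the mechanism as the abstract states it: head features collected from argument phrases are checked, in order, against a verb's subcategorization list, covering case, agreement, and complement order. The feature inventory, lexicon entry, and function names are hypothetical illustrations, not the paper's implementation, and the sketch omits the parallel memory network itself.

```python
# Hypothetical sketch (not the authors' implementation): head features
# propagated from argument phrases are unified against the constraints in a
# verb's subcategorization list. The toy lexicon and feature names are invented.

def unify(features, constraints):
    """True if every constraint value matches the propagated head features."""
    return all(features.get(key) == value for key, value in constraints.items())

# Toy subcategorization entry: an ordered list of constraints, one per
# subcategorized argument (ordering models complement order).
SUBCAT = {
    "gives": [
        {"cat": "NP", "case": "nom", "num": "sg", "pers": "3"},  # subject
        {"cat": "NP", "case": "acc"},                             # object
        {"cat": "PP", "pform": "to"},                             # goal
    ],
}

def check_arguments(verb, arguments):
    """Check each argument's propagated head features against the verb's
    subcategorization list, in order."""
    constraints = SUBCAT[verb]
    if len(arguments) != len(constraints):
        return False
    return all(unify(arg, con) for arg, con in zip(arguments, constraints))

if __name__ == "__main__":
    # "She gives it to Mary": head features gathered from each argument phrase.
    args = [
        {"cat": "NP", "case": "nom", "num": "sg", "pers": "3"},  # she
        {"cat": "NP", "case": "acc"},                             # it
        {"cat": "PP", "pform": "to"},                             # to Mary
    ]
    print(check_arguments("gives", args))  # True: all constraints satisfied
```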