eScholarship
Open Access Publications from the University of California

A Graph Propagation Architecture for Massively-Parallel Constraint Processing of Natural Language

Abstract

We describe a model of natural language understanding based on the notion of propagating constraints in a semantic memory. The model contains a massively-parallel memory network in which constraint graphs, representing syntactic and other constraints associated with the nodes that triggered activations, are propagated. When the propagated constraint graphs of complement nodes collide with the constraint graphs postulated by head nodes, the graphs are unified, thereby applying the constraints. This mechanism handles linguistic phenomena such as case, agreement, binding, and control in a principled manner, in effect equivalent to the way they are handled in modern linguistic theories.
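The core operation the abstract describes, unifying a complement node's propagated constraint graph with the constraint graph postulated by a head node, can be illustrated with a minimal sketch. This is not the paper's implementation: it assumes constraint graphs are nested dictionaries of feature-value pairs (a common encoding in unification-based grammar), and the function name `unify` is chosen for illustration.

```python
def unify(head_graph, comp_graph):
    """Unify two constraint graphs (nested dicts of features).

    Returns the merged graph, or None when the graphs collide on
    an incompatible value (a constraint violation).
    """
    # Atomic values unify only if they are identical.
    if head_graph == comp_graph:
        return head_graph
    if not isinstance(head_graph, dict) or not isinstance(comp_graph, dict):
        return None  # atomic clash, e.g. "acc" vs. "nom"

    result = dict(head_graph)
    for feature, value in comp_graph.items():
        if feature in result:
            sub = unify(result[feature], value)
            if sub is None:
                return None  # propagate the clash upward
            result[feature] = sub
        else:
            result[feature] = value  # complement adds new information
    return result


# A head postulates an accusative complement; the complement's
# propagated graph carries case and number features.
head = {"case": "acc"}
comp = {"case": "acc", "num": "sg"}
print(unify(head, comp))            # graphs merge: constraint satisfied
print(unify({"case": "nom"}, comp)) # case clash: unification fails (None)
```

In this reading, "constraint application" is exactly the success or failure of unification: agreement and case phenomena fall out of whether the colliding graphs carry compatible feature values.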
