eScholarship
Open Access Publications from the University of California

Syntactic Systematicity Arising from Semantic Predictions in a Hebbian-Competitive Network

Abstract

A Hebbian-inspired, competitive network is presented which learns to predict the typical semantic features of denoting terms in simple and moderately complex sentences. In addition, the network learns to predict the appearance of syntactically key words, such as prepositions and relative pronouns. Importantly, a strong form of syntactic systematicity emerges as a by-product of the network's semantic training. Moreover, the network can integrate novel nouns and verbs into its training process: when a novel word is encountered, its predicted semantic features are assigned as a default meaning. All network training is unsupervised with respect to error feedback. The issues addressed here have been the subject of debate among notable psychologists, philosophers, and linguists over the last decade.
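The core mechanism named in the abstract, unsupervised competitive learning with a Hebbian-style weight update, can be sketched as follows. This is a minimal, generic winner-take-all rule for illustration only; the dimensions, learning rate, and input encoding are assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 12 input semantic features, 4 competitive units.
n_in, n_units = 12, 4
W = rng.random((n_units, n_in))
W /= W.sum(axis=1, keepdims=True)  # normalize each unit's weight vector

def competitive_hebbian_step(x, W, lr=0.1):
    """One unsupervised update: the unit whose weights best match the
    input wins, and only the winner's weights move toward the input.
    No error signal is used, consistent with training that is
    unsupervised with respect to error feedback."""
    winner = int(np.argmax(W @ x))
    W[winner] += lr * (x - W[winner])  # Hebbian move toward the input
    return winner

# Train on random binary "semantic feature" vectors.
for _ in range(200):
    x = (rng.random(n_in) > 0.5).astype(float)
    competitive_hebbian_step(x, W)
```

After training, each unit's weight vector approximates the typical features of the inputs it wins, which is one simple way a network can come to predict default semantic features for a term.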
