
Neural Language Models Capture Some, But Not All, Agreement Attraction Effects

Creative Commons 'BY' version 4.0 license
Abstract

The number of the subject in English must match the number of the corresponding verb (dog runs but dogs run). Yet in real-time language production and comprehension, speakers often mistakenly compute agreement between the verb and a grammatically irrelevant non-subject noun phrase instead. This phenomenon, referred to as agreement attraction, is modulated by a wide range of factors; any complete computational model of grammatical planning and comprehension would be expected to derive this rich empirical picture. Recent developments in Natural Language Processing have shown that neural networks trained only on word prediction over large corpora are capable of capturing subject-verb agreement dependencies to a significant extent, but with occasional errors. In this paper, we evaluate the potential of such neural word prediction models as a foundation for a cognitive model of real-time grammatical processing. We use LSTMs, a common sequence prediction model used to model language, to simulate six experiments taken from the agreement attraction literature. The LSTMs captured the critical human behavior in three out of the six experiments, indicating that (1) some agreement attraction phenomena can be captured by a generic sequence processing model, but (2) capturing the other phenomena may require models with more language-specific mechanisms.
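To make the evaluation setup concrete, the sketch below shows one common way an LSTM language model can be probed for agreement attraction: compare the model's next-word probabilities for a singular versus a plural verb after a preamble whose head noun is singular but which contains a plural "attractor" noun (e.g., "the key to the cabinets..."). This is a minimal illustration, not the authors' code; the vocabulary, model size, and untrained weights are assumptions for self-containedness, and a real evaluation would load an LSTM trained on word prediction over a large corpus.

```python
# Minimal sketch of probing an LSTM language model for agreement attraction.
# Assumption: in a real study, `model` would be trained on a large corpus;
# here it is randomly initialized so the script runs end to end.

import torch
import torch.nn as nn

vocab = ["<unk>", "the", "key", "keys", "to", "cabinet", "cabinets", "is", "are"]
word2id = {w: i for i, w in enumerate(vocab)}

class LSTMLM(nn.Module):
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, ids):
        h, _ = self.lstm(self.embed(ids))
        return self.out(h)  # logits over the next word at each position

model = LSTMLM(len(vocab))

def verb_probs(preamble):
    """Return P(is) and P(are) as the next word after the preamble."""
    ids = torch.tensor([[word2id.get(w, 0) for w in preamble.split()]])
    with torch.no_grad():
        logits = model(ids)[0, -1]  # next-word logits after the final word
    probs = torch.softmax(logits, dim=-1)
    return probs[word2id["is"]].item(), probs[word2id["are"]].item()

# Attraction would show up as a boost in the (ungrammatical) plural verb's
# probability when the non-subject noun is plural:
for preamble in ["the key to the cabinet", "the key to the cabinets"]:
    p_is, p_are = verb_probs(preamble)
    print(f"{preamble!r}: P(is)={p_is:.3f}  P(are)={p_are:.3f}")
```

Simulating a human experiment then amounts to running such minimal pairs for each condition of the original design and comparing the pattern of verb-form probabilities (or errors) to the pattern of human error rates or reading times.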
