eScholarship
Open Access Publications from the University of California

Interaction with Context During Recurrent Neural Network Sentence Processing

Creative Commons 'BY' version 4.0 license
Abstract

Syntactic ambiguities in isolated sentences can lead to increased difficulty in incremental sentence processing, a phenomenon known as a garden-path effect. This difficulty, however, can be alleviated for humans when they are presented with supporting discourse contexts. We tested whether recurrent neural network (RNN) language models (LMs) could learn linguistic representations that are similarly influenced by discourse context. RNN LMs have been claimed to learn a variety of syntactic constructions. However, recent work has suggested that pragmatically conditioned syntactic phenomena are not acquired by RNNs. In comparing model behavior to human behavior, we show that our models can, in fact, learn pragmatic constraints that alleviate garden-path effects given the correct training and testing conditions. This suggests that some aspects of linguistically relevant pragmatic knowledge can be learned from distributional information alone.
