Interaction with Context During Recurrent Neural Network Sentence Processing
Abstract
Syntactic ambiguities in isolated sentences can lead to increased difficulty in incremental sentence processing, a phenomenon known as a garden-path effect. This difficulty, however, can be alleviated for humans when they are presented with supporting discourse contexts. We tested whether recurrent neural network (RNN) language models (LMs) could learn linguistic representations that are similarly influenced by discourse context. RNN LMs have been claimed to learn a variety of syntactic constructions. However, recent work has suggested that pragmatically conditioned syntactic phenomena are not acquired by RNNs. In comparing model behavior to human behavior, we show that our models can, in fact, learn pragmatic constraints that alleviate garden-path effects given the correct training and testing conditions. This suggests that some aspects of linguistically relevant pragmatic knowledge can be learned from distributional information alone.