Compositional Neural Machine Translation by Removing the Lexicon from Syntax
Tristan Thrush

Abstract

The meaning of a natural language utterance is largely determined by its syntax and words. Additionally, there is evidence from theories in semantics and neuroscience that humans process an utterance by separating some amount of knowledge about the lexicon from knowledge of word order. In this paper, we propose neural units that can enforce this constraint over an LSTM encoder and decoder. We demonstrate that our model achieves competitive performance across a variety of domains, including semantic parsing, syntactic parsing, and English to Mandarin Chinese translation. In these cases, our model outperforms the standard LSTM encoder and decoder architecture on many or all of our metrics. To demonstrate that our model achieves the desired partial separation between the lexicon and syntax, we analyze its weights and explore its behavior when different neural modules are damaged. When damaged, we find that the model displays the knowledge distortions evidenced in aphasics.
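For context on the baseline the abstract compares against, the following is a minimal sketch of a standard LSTM encoder-decoder (the "standard LSTM encoder and decoder architecture" mentioned above), written in PyTorch with hypothetical names and sizes. It does not implement the paper's proposed units for separating lexical knowledge from word-order knowledge.

```python
# Minimal sketch (assumption: PyTorch, hypothetical dimensions) of the standard
# LSTM encoder-decoder baseline referenced in the abstract. The paper's proposed
# lexicon/syntax-separating units are NOT implemented here.
import torch
import torch.nn as nn


class Seq2SeqBaseline(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the source sentence into final hidden and cell states.
        _, (h, c) = self.encoder(self.src_emb(src_ids))
        # Decode conditioned on the encoder state (teacher forcing:
        # gold target tokens are fed as decoder inputs).
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), (h, c))
        return self.out(dec_out)  # logits over the target vocabulary


# Example usage with toy batch shapes.
model = Seq2SeqBaseline(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (4, 7))   # 4 source sentences of length 7
tgt = torch.randint(0, 1200, (4, 9))   # 4 target sentences of length 9
logits = model(src, tgt)               # shape: (4, 9, 1200)
```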
