
Exploring Lexical Relations in BERT using Semantic Priming

Abstract

BERT is a language processing model trained for word prediction in context, which has shown impressive performance in natural language processing tasks. However, the principles underlying BERT's use of linguistic cues present in context are yet to be fully understood. In this work, we develop tests informed by the semantic priming paradigm to investigate BERT's handling of lexical relations when completing a cloze task (Taylor, 1953). We define priming as an increase in BERT's expectation for a target word (pilot) in a context (e.g., "I want to be a ___"), when the context is prepended with a related word (airplane) as opposed to an unrelated one (table). We explore BERT's priming behavior under various predictive constraints placed on the blank, and find that BERT is sensitive to lexical priming effects only under minimal constraint from the input context. This pattern was found to be consistent across diverse lexical relations.
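The abstract's priming measure can be illustrated with a short sketch: compare BERT's probability for the target word at a masked position when the context is prepended with a related versus an unrelated prime. This is not the authors' code; the model name (`bert-base-uncased`), the way the prime is prepended (as a single word followed by a period), the example words, and the use of a simple probability difference are all assumptions for illustration.

```python
# Minimal sketch of the priming measure described in the abstract (assumptions noted above).
import torch
from transformers import BertTokenizer, BertForMaskedLM

model_name = "bert-base-uncased"  # assumed model variant
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForMaskedLM.from_pretrained(model_name)
model.eval()

def target_probability(context: str, target: str) -> float:
    """Return BERT's probability for `target` at the [MASK] position in `context`."""
    inputs = tokenizer(context, return_tensors="pt")
    mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits[0, mask_index], dim=-1)
    target_id = tokenizer.convert_tokens_to_ids(target)
    return probs[0, target_id].item()

# Priming effect: change in the target's probability when the context is
# prepended with a related prime versus an unrelated one (illustrative words).
context = "I want to be a [MASK]."
target = "pilot"
p_related = target_probability("airplane. " + context, target)
p_unrelated = target_probability("table. " + context, target)
print(f"related={p_related:.4f} unrelated={p_unrelated:.4f} priming={p_related - p_unrelated:.4f}")
```

A positive difference here would correspond to a priming effect in the sense defined above; the paper additionally varies how strongly the context constrains the blank, which this sketch does not model.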
