eScholarship
Open Access Publications from the University of California

Artificial Language Learning: Combining Syntax and Semantics

Creative Commons 'BY' version 4.0 license
Abstract

Artificial Grammar Learning (AGL) paradigms are a powerful method to study language learning and processing. However, unlike natural languages, these tasks rely on grammars specifying relationships between meaningless stimuli with no real-world referents. Therefore, learning is typically assessed based on grammaticality or familiarity judgements, assessing how well-formed a sequence is. We combined a meaningful vocabulary (in which nonsense words refer to properties of visual stimuli, i.e., colored shapes) with different grammatical structures (adjacent, center-embedded, or crossed dependencies). Using an incremental, starting-small paradigm, participants were asked to interpret increasingly complex sequences of nonsense words and select the set of visual stimuli that they described. High levels of learning were observed for all grammars, including those which have previously been difficult to learn in traditional AGL paradigms. Here, the addition of semantics not only allows closer comparisons to natural language but also aids learning, representing a valuable approach to studying language learning.
