
Music, language, and gesture: Neural oscillations and relational cognition

Abstract

Music, language, and action involve the ability to combine and flexibly recombine sequences of discrete elements into hierarchical structures. Can structures in one domain influence those in another? Does this sequential structure-building process rely on shared neural resources or shared types of computation? Initially, we tracked a neural correlate of this sequential structure-building process in each domain individually using steady-state evoked potentials (SSEPs). We then explored the behavioral effect on sentence comprehension of mismatching linguistic phrase structures with metrical musical ones. We interpret our findings in terms of the Shared Syntactic Integration Resource Hypothesis. We extend the purview of this theory beyond harmonic syntax in music to considerations of how the mental organisation of musical elements in time (meter) can be considered syntactic. Our findings suggest fresh parallels between language and music, and indicate how certain processes may be shared by more domain-general aspects of our cognitive architecture.
