
Realtime integration of acoustic cues and semantic expectations in speech processing: Evidence from EEG

Abstract

A critical debate in speech perception concerns the stages of processing and their interactions. One source of evidence is the time course over which different sources of information affect ongoing processing. We used electroencephalography (EEG) to ask when semantic expectations and acoustic cues are integrated neurophysiologically. Participants (N=31) heard target words from a voicing continuum (bark/park) in which both voice onset time (VOT) and preceding coarticulation were manipulated. Targets were embedded in sentences predicting one phoneme or the other (Good dogs sometimes). We used a component-independent analysis every 2 msec to determine when each cue affected the continuous EEG signal. This revealed an early window (125-225 msec) sensitive exclusively to perceptual information (VOT), a later window (400-575 msec) sensitive to semantic information, and a critical intermediate window (225-350 msec) when VOT and coarticulation are processed simultaneously with semantic expectations. This suggests continuous cascades and interactions between lower-level and higher-level processes.

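The "component-independent analysis every 2 msec" is described only at a high level in the abstract. As a rough illustration of what such a time-resolved analysis can look like, the sketch below runs an ordinary least-squares regression of EEG amplitude on the three cues at every sample of a simulated single-channel epoch. The variable names, simulated data, and regression form are illustrative assumptions, not the authors' actual pipeline.

    # Minimal sketch (not the authors' pipeline) of a time-point-by-time-point
    # regression: at every 2-ms EEG sample, regress amplitude on the three cues
    # to ask when each one reliably predicts the signal.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials, n_samples = 480, 450          # e.g., a 900-ms epoch sampled every 2 ms

    # Trial-level predictors (illustrative): VOT step, coarticulation step,
    # and sentence context (semantic expectation), plus an intercept.
    X = np.column_stack([
        np.ones(n_trials),                  # intercept
        rng.standard_normal(n_trials),      # vot
        rng.standard_normal(n_trials),      # coarticulation
        rng.standard_normal(n_trials),      # context
    ])
    eeg = rng.standard_normal((n_trials, n_samples))   # placeholder single-channel EEG

    # Ordinary least squares at every time point: betas has shape (4, n_samples).
    betas, *_ = np.linalg.lstsq(X, eeg, rcond=None)

    # Each cue's row of betas gives its effect over time; windows where a
    # coefficient is reliably nonzero (after correcting across time points)
    # mark when that cue is influencing the EEG signal.
    vot_effect, coartic_effect, context_effect = betas[1], betas[2], betas[3]
    print(vot_effect.shape, context_effect.shape)

In real data this would be run over multiple electrodes and evaluated with an appropriate correction for testing at every time point; the windows reported in the abstract (125-225, 225-350, and 400-575 msec) correspond to where such effects were reliable.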