eScholarship
Open Access Publications from the University of California

Adults and preschoolers seek visual information to support language comprehension in noisy environments

Abstract

Language comprehension in grounded, social contexts involves integrating information from both the visual and the linguistic signals. But how should listeners prioritize these different information sources? Here, we test the hypothesis that even young listeners flexibly adapt the dynamics of their gaze to seek higher value visual information when the auditory signal is less reliable. We measured the timing and accuracy of adults' (n=31) and 3-5 year-old children's (n=39) eye movements during a real-time language comprehension task. Both age groups delayed the timing of gaze shifts away from a speaker's face when processing speech in a noisy environment. This delay resulted in listeners gathering more information from the visual signal, more accurate gaze shifts, and fewer random eye movements to the rest of the visual world. These results provide evidence that even young listeners adjust to the demands of different processing contexts by seeking out visual information that supports language comprehension.
