eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Coordinated minds: how iconic co-speech gestures mediate communication

Abstract

Iconic co-speech gestures are spontaneous body movements produced in coordination with speaking. Concurrently with the word platter, for example, a speaker might trace an oval in the air, showing something about the shape of the object that he is conceptualizing. Chapter 2 investigates fundamental cognitive processes mediating gesture comprehension by comparing ERPs elicited by contextually congruent and incongruent gestures. Because gestures are not part of a conventionalized symbolic system, researchers have argued that they do not convey substantive content on their own; rather, their meaning is driven by the speech that accompanies them. Chapter 3 tests these claims by measuring real-time semantic activations prompted by iconic gestures presented in the absence of supporting context. To assess how gestures affect comprehension of discourse, EEG was recorded as healthy adults viewed short segments of spontaneous discourse involving both gestures and verbal utterances (Chapter 4). Discourse segments were followed either by related picture probes, which corresponded with information made available both in speech and gesture (Cross-modal Matches) or in speech alone (Speech-only Matches), or by unrelated controls. By comparing brain responses to Cross-modal and Speech-only Matches, it is possible to assess the specificity of semantic activations during the integration of speech and gestures. Finally, Chapter 5 investigates whether iconic gestures engage object recognition processes implicated in the comprehension of conventionally depictive representations, such as photographs. "Static gestures" were created by extracting from each dynamic gesture a single freeze frame that made visible critical information about the speaker's intended meaning. EEG was recorded as participants viewed static and dynamic gestures, as well as photographs of common objects. The distribution and time course of ERP effects elicited by these stimulus types were compared.
