eScholarship
Open Access Publications from the University of California

UC San Diego

UC San Diego Electronic Theses and Dissertations

Functional and neural organization underlying face and facial expression perception

Abstract

Users of American Sign Language (ASL) must recognize certain non-affective facial expressions as linguistic markers that signal distinct lexical and syntactic structures. When perceiving visual linguistic input, ASL signers must quickly identify and discriminate between different linguistic and affective facial expressions in order to process and interpret signed sentences. Thus, signers have a very different perceptual and cognitive experience with the human face than nonsigners. This dissertation examines, in three separate studies, how experience with American Sign Language leads to changes in the processing and neural organization underlying the perception of faces and facial expressions. Study 1 examined face processing in Deaf signers and hearing nonsigners and revealed that these groups do not differ in face recognition or gestalt face processing ability; however, ASL signers exhibit a superior ability to detect subtle differences in local facial features. Study 2 examined whether categorical perception (CP) of affective facial expressions also extends to ASL facial expressions, and whether lifelong experience with ASL affects CP for affective facial expressions. Deaf signers and hearing nonsigners performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. Significant CP effects were observed in hearing nonsigners for both affective and linguistic facial expressions. Deaf signers, however, showed a significant CP effect only for linguistic facial expressions. Study 3 examined facial expression perception using functional magnetic resonance imaging (fMRI) and found that activation within the superior temporal sulcus (STS) for emotional expressions was right lateralized for hearing nonsigners and bilateral for Deaf signers. In contrast, activation within the STS for linguistic facial expressions was left lateralized only for signers, and only when linguistic facial expressions co-occurred with verbs. Within the fusiform gyrus (FG), activation was left lateralized for ASL signers for both expression types, whereas activation was bilateral for both expression types for nonsigners. Taken together, the results demonstrate that linguistic processing of facial expressions leads to specific changes in the processing of, and neural organization for, human faces and facial expressions.
