UC San Francisco Electronic Theses and Dissertations

Auditory Processing and Perception in Songbirds

Abstract

Songbirds, like humans, learn to produce and to recognize complex, species-specific sounds, providing a biologically tractable model for studying the neural mechanisms of speech production and perception. I used chronic recordings from single neurons and operant behavioral techniques to ask how complex sounds are represented in the songbird forebrain, and how this representation may relate to the birds' perception of song. I found that neurons in field L, the avian analog of human primary auditory cortex, represent three different types of modulations found in natural sounds: spectral modulations, temporal modulations, and spectro-temporal modulations. Neurons specialized for different modulations have different physiological properties and are localized to different parts of field L. The response properties of these neurons depend nonlinearly on the average intensity of the stimulus: at high intensities, they respond only to differences in sound energy between nearby frequencies or times, whereas at low intensities they integrate information across nearby frequencies and times. This nonlinearity is shared with the visual system and may represent a computational principle of sensory encoding. Finally, I used operant techniques to ask whether songbirds could generalize a learned song discrimination task to songs altered in pitch, duration, or volume. I found that birds generalized correctly to songs altered in duration but not to those altered in pitch or volume. These data suggest that birds use the spatial pattern of neurons activated by a song, rather than the temporal pattern of neural activation, to determine which song they heard.
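The three modulation types can be made concrete by taking the two-dimensional Fourier transform of a sound's log spectrogram, which yields its modulation spectrum: energy along one axis corresponds to purely temporal modulations, energy along the other to purely spectral modulations, and off-axis energy to joint spectro-temporal modulations. The sketch below is only an illustration of that decomposition, not the analysis used in the dissertation; the synthetic test signal and all parameter values are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic test signal standing in for a song syllable: a tone with slow
# frequency modulation (2 Hz) and amplitude modulation (10 Hz).
fs = 22050                                   # sample rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
inst_freq = 3000 + 500 * np.sin(2 * np.pi * 2 * t)     # instantaneous frequency
phase = 2 * np.pi * np.cumsum(inst_freq) / fs           # integrate to get phase
signal = np.sin(phase) * (1 + 0.5 * np.sin(2 * np.pi * 10 * t))

# Log-amplitude spectrogram: a frequency-by-time representation of the sound.
f, tt, S = spectrogram(signal, fs=fs, nperseg=512, noverlap=384)
log_S = np.log(S + 1e-12)

# Modulation spectrum: 2-D Fourier transform of the (mean-subtracted) log spectrogram.
mod_spec = np.abs(np.fft.fftshift(np.fft.fft2(log_S - log_S.mean())))

dt = tt[1] - tt[0]                           # spectrogram time step (s)
df = (f[1] - f[0]) / 1000.0                  # spectrogram frequency step (kHz)
temporal_mod = np.fft.fftshift(np.fft.fftfreq(log_S.shape[1], d=dt))  # Hz
spectral_mod = np.fft.fftshift(np.fft.fftfreq(log_S.shape[0], d=df))  # cycles/kHz

# Energy near the temporal-modulation axis reflects temporal modulations,
# energy near the spectral-modulation axis reflects spectral modulations,
# and off-axis energy reflects joint spectro-temporal modulations.
```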
