eScholarship
Open Access Publications from the University of California

Mixed reference frame representations underlie the use of multimodal sensory signals for reaching.

  • Author(s): McGuire, Leah Marie Medvick
  • Advisor(s): Sabes, Philip N
Abstract

The sensory signals that drive movement planning arrive in a variety of "reference frames", and integrating or comparing them requires sensory transformations. I set out to examine how the different forms of sensory signals, and the transformations needed to compare them, affect the representation and integration of multimodal sensory information. I used a combination of human psychophysics and electrophysiological recordings from rhesus macaques to examine how visual and proprioceptive information are used for reach planning and execution.

The human experiment was designed to exploit stereotyped patterns of gaze-dependent reach errors to determine whether the reference frame representations for reach planning depend on the visual and proprioceptive sensory information available. The results of this experiment were interpreted with a model of reach planning in which the statistical properties of sensory signals and their transformations determine how these signals are used. I found that no single reference frame representation was adequate to explain the observed error patterns when visual and proprioceptive information were varied. Only by integrating movement plans across multiple reference frame representations was the model able to capture the observed error patterns (Chapter 1).
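The core idea of such statistical models can be sketched in a few lines: unbiased estimates are combined by inverse-variance weighting, and a coordinate transformation (e.g. eye-centered to body-centered) adds noise, so the same pair of signals is weighted differently depending on the frame in which they are integrated. The following is a minimal illustrative sketch of that principle, not the dissertation's actual model; all names and numbers are hypothetical.

```python
# Hypothetical sketch: reliability-weighted (minimum-variance) integration of a
# visual and a proprioceptive estimate of hand position in one dimension.
# All values are illustrative, not taken from the dissertation.

def integrate(est_a, var_a, est_b, var_b):
    """Combine two unbiased estimates by inverse-variance weighting."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_b)
    est = w_a * est_a + (1.0 - w_a) * est_b
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    return est, var

vis_eye, var_vis = 10.0, 1.0      # visual estimate, native to an eye-centered frame
prop_body, var_prop = 12.0, 2.0   # proprioceptive estimate, native to a body-centered frame
transform_noise = 1.5             # variance added by a reference frame transformation

# Integrating in the body-centered frame: the visual signal must be
# transformed, so it carries extra noise and is down-weighted.
est_body, var_body = integrate(vis_eye, var_vis + transform_noise,
                               prop_body, var_prop)

# Integrating in the eye-centered frame: now proprioception pays the
# transformation cost instead, yielding a different combined estimate.
est_eye, var_eye = integrate(vis_eye, var_vis,
                             prop_body, var_prop + transform_noise)
```

Because the transformation cost falls on different signals in different frames, the two combined estimates differ; averaging plans formed in multiple frames, as in the dissertation's model, produces error patterns that no single frame can reproduce.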

Taking the results from this model, I next looked for evidence of these multiple reference frame representations across different sensory modalities in sensory-motor cortical areas of the rhesus macaque (Area 5 and MIP). I found that neurons in these areas use mixed reference frame representations, which are consistent across reaches to targets specified by different sensory modalities (visual and/or proprioceptive targets, Chapter 2). Additionally, I found that integration of multimodal sensory signals in Area 5 and MIP emerges primarily across the population response rather than within individual cells' responses (Chapter 3). These findings are consistent with the model results showing that sensory information is integrated in multiple reference frame representations, regardless of the reference frame in which sensory information enters the nervous system. These results illustrate one way that the brain can represent and integrate sensory information arriving from different sensory modalities.
