Open Access Publications from the University of California


UCLA Previously Published Works

Multisensory control of multimodal behavior: do the legs know what the tongue is doing?

  • Author(s): Cushman, Jesse D;
  • Aharoni, Daniel B;
  • Willers, Bernard;
  • Ravassard, Pascal;
  • Kees, Ashley;
  • Vuong, Cliff;
  • Popeney, Briana;
  • Arisaka, Katsushi;
  • Mehta, Mayank R
  • et al.

Understanding adaptive behavior requires precisely controlled presentation of multisensory stimuli combined with simultaneous measurement of multiple behavioral modalities. Hence, we developed a virtual reality apparatus that allows simultaneous measurement of reward checking, a commonly used measure in associative learning paradigms, and navigational behavior, along with precisely controlled presentation of visual, auditory, and reward stimuli. Rats performed a virtual spatial navigation task analogous to the Morris maze, in which only distal visual or auditory cues provided spatial information. Spatial navigation and reward checking maps showed experience-dependent learning and were in register for distal visual cues. However, they showed a dissociation, whereby distal auditory cues failed to support spatial navigation but did support spatially localized reward checking. These findings indicate that rats can navigate in virtual space with only distal visual cues, without significant vestibular or other sensory inputs. Furthermore, they reveal a simultaneous dissociation between two reward-driven behaviors.

