eScholarship
Open Access Publications from the University of California

Decoding the computations of sensory neurons

  • Author(s): Kaardal, Joel Thomas
  • Advisor(s): Sharpee, Tatyana O
  • Abarbanel, Henry D I
  • et al.
Abstract

The nervous system encodes information about external stimuli through sophisticated computations performed by vast networks of sensory neurons. Since the space of all possible stimuli is much larger than the space of those that are ultimately meaningful, dimensionality reduction techniques were developed to identify the subspace of stimulus space relevant to neural activity. However, dimensionality reduction methods provide limited insight into the nonlinear functions that build the nervous system's internal model of the world. Chapter 2 introduces the functional basis, a transformation of the relevant subspace into a basis that describes the computational function of the subunits that make up the neural circuitry. This functional basis is used to uncover novel insights about the computations performed by neurons in low-level vision and, later, in high-level auditory circuitry. For the latter, significant barriers are found in the ability of current dimensionality reduction methods to recover the relevant subspaces of high-level sensory neurons. These barriers arise because high-level sensory neurons, which are often unresponsive to noise stimuli, are relatively difficult to stimulate while still maintaining a thorough exploration of the stimulus distribution. In response, Chapter 3 formulates a new approach to dimensionality reduction, the low-rank maximum noise entropy method, which makes it possible to overcome the challenges presented by high-level sensory systems. In Chapter 4, functional bases derived from the relevant subspaces recovered by the low-rank maximum noise entropy method are employed to study the neural computations performed by high-level auditory neurons.
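To give a concrete sense of the model class the abstract refers to, the sketch below shows a second-order maximum-noise-entropy response model whose quadratic stimulus interaction is constrained to be low rank. The logistic form and the factorization J = U Vᵀ are illustrative assumptions for this example, not necessarily the dissertation's exact parameterization; all variable names here are hypothetical.

```python
import numpy as np

def mne_spike_prob(s, a, h, U, V):
    """Spike probability under a second-order maximum-noise-entropy model:
        P(spike | s) = 1 / (1 + exp(a + h.s + s.J.s)),
    where the quadratic term J is parameterized in low-rank form as
    J = U @ V.T (an illustrative factorization). With d stimulus
    dimensions and rank r, this needs d*r parameters per factor
    instead of the d*d of a full-rank J.
    """
    quad = s @ (U @ V.T) @ s          # low-rank quadratic term s.J.s
    return 1.0 / (1.0 + np.exp(a + h @ s + quad))

# Toy example: a 20-dimensional stimulus with a rank-2 interaction.
rng = np.random.default_rng(0)
d, r = 20, 2
s = rng.standard_normal(d)            # one stimulus sample
a = 0.1                               # bias (sets baseline firing)
h = 0.1 * rng.standard_normal(d)      # linear filter
U = 0.1 * rng.standard_normal((d, r)) # low-rank factors of J
V = 0.1 * rng.standard_normal((d, r))

p = mne_spike_prob(s, a, h, U, V)     # a probability in (0, 1)
```

In a fitting context, the parameters a, h, U, and V would be optimized against recorded spike trains; the columns of the low-rank factors then span a candidate relevant subspace.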
