Nonlinear Dendritic Dynamics and their Effect on the Information Processing Capabilities of Neurons
- Author(s): Saad Khoury, Helen G.; et al.
A major challenge in neuroscience is to reverse engineer the brain and understand its information processing and learning capabilities. While the pace of discovery and untangling of the brain's staggering dynamics is advancing at unprecedented speed, especially with recently developed tools and imaging techniques, this advancement is not devoid of risk: the arsenal of novel techniques generates a huge mass of data that may further complicate the unraveling of brain function. Is every ion channel, every spine, every dendrite, every neuron, and every synaptic connection necessary to achieve the computational capabilities of the central nervous system? Answering this question raises the need for a two-way communication between experiments and mathematical theoretical work.

Neural networks composed of point neurons and endowed with biologically inspired synaptic learning rules have been successfully applied to a variety of challenging learning-related tasks, including pattern recognition, associative memory, and map formation, among others. While these networks perform well on the tasks they are built for, a gap still separates us from fully understanding how the brain excels at the large multitude of tasks it can perform. Reflecting on most efforts in building and simulating neural networks, we must ask what the appropriate scale for modeling is: given the complexity of the nervous system, is it enough to model neurons as point-like units in which a weighted sum of synaptic inputs is passed through a single spike-generating mechanism?

From a modern perspective, the point neuron seems likely to be a poor representation of synaptic integration in neurons with large, profusely branched, active dendrites, which populate brain structures associated with advanced cognitive functions and learning. These dendrites are endowed with nonlinear active conductances that modulate synaptic integration and somatic activity.
Does the increased nonlinearity at the level of the neuron enhance the computational power of the neuron, and that of the network? In an effort to answer these questions, we implemented a simplified mathematical model of a pyramidal neuron endowed with complex dendritic dynamics and quantified its information processing capabilities using Shannon's theory of mutual information. We demonstrated that a neuron with multiple sites of independent thresholding of synaptic inputs, passive and active forward and backward propagation, backpropagating-action-potential-activated calcium spike firing, and coincidence detection has a higher capacity for information processing than either a point neuron or a network of two point neurons. This advantage in information processing, coupled with the simplicity and scalability of the implemented neuron model, constitutes a compelling reason to promote the use of such a spatially extended neuron model in networks that undergo plasticity and learning.
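The intuition behind the result can be illustrated with a toy calculation (this is not the model used in the work itself; the branch structure, unit weights, thresholds, and sigmoidal nonlinearities below are illustrative assumptions). A "point neuron" applies one global thresholding to the summed input, so its output distinguishes only the total input count; a two-layer neuron whose dendritic branches each threshold their own synaptic sum independently distinguishes more input configurations, and for a deterministic map with equiprobable inputs the mutual information I(X;Y) reduces to the output entropy H(Y):

```python
import numpy as np
from itertools import product

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def mutual_information_bits(y, decimals=6):
    """For a deterministic map y = f(x) over equiprobable inputs x,
    I(X;Y) = H(Y): the Shannon entropy of the output distribution."""
    _, counts = np.unique(np.round(y, decimals), return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# All 2^8 equiprobable binary input patterns; unit synaptic weights.
X = np.array(list(product([0, 1], repeat=8)), dtype=float)

# Point neuron: a single global sigmoidal thresholding of the summed input.
y_point = sigmoid(X.sum(axis=1) - 4.0)

# Two-layer neuron: two dendritic branches, each independently thresholding
# its own synaptic sum before the somatic nonlinearity.
d1 = sigmoid(X[:, :4].sum(axis=1) - 2.0)
d2 = sigmoid(X[:, 4:].sum(axis=1) - 2.0)
y_dend = sigmoid(3.0 * (d1 + d2) - 3.0)

print(f"point neuron:     I(X;Y) = {mutual_information_bits(y_point):.2f} bits")
print(f"dendritic neuron: I(X;Y) = {mutual_information_bits(y_dend):.2f} bits")
```

In this sketch the point neuron's output depends only on the total spike count (9 distinct values), while the two-layer neuron's output depends on the per-branch counts, yielding more distinguishable output levels and hence higher mutual information between input patterns and output.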