Information Maximization in Early Sensory Systems
Information maximization is a strong candidate design principle for early sensory systems. Yet previous applications of information maximization have mostly been restricted to linear or small neural systems, because mutual information is difficult to compute. To address this problem, we developed a method that efficiently computes the mutual information that the responses of a large neural population provide about high-dimensional inputs.
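To fix ideas, the following is a minimal sketch of a direct, plug-in mutual information estimate between paired discrete samples (for example, binned stimuli and population response words). It is an illustration of the quantity being computed, not the efficient method described above, whose details are not given here; the function name and toy data are hypothetical.

```python
import numpy as np

def mutual_information(stim, resp):
    """Plug-in estimate of I(stim; resp) in bits from paired samples of
    discrete labels (e.g., binned stimuli and population response words)."""
    stim = np.asarray(stim)
    resp = np.asarray(resp)
    n = len(stim)
    # empirical joint distribution over (stimulus, response) pairs
    pairs, counts = np.unique(np.stack([stim, resp], axis=1),
                              axis=0, return_counts=True)
    p_joint = counts / n
    # empirical marginals
    vs, cs = np.unique(stim, return_counts=True)
    vr, cr = np.unique(resp, return_counts=True)
    p_s = dict(zip(vs, cs / n))
    p_r = dict(zip(vr, cr / n))
    # I = sum_{s,r} p(s,r) log2 [ p(s,r) / (p(s) p(r)) ]
    mi = 0.0
    for (s, r), pj in zip(pairs, p_joint):
        mi += pj * np.log2(pj / (p_s[s] * p_r[r]))
    return mi
```

Such direct estimators scale poorly with population size, which is exactly why an efficient method is needed for large populations.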
Using our method, we first quantify information transmission by multiple overlapping retinal ganglion cell mosaics. The results reveal a transition at which a single high-density mosaic becomes less informative than two or more overlapping lower-density mosaics. These results explain differences in the relative fractions of multiple cell types and predict the existence of new retinal ganglion cell subtypes.
We then apply our method to neurons that receive time-varying stimuli and produce spike trains. Surprisingly, we find that the optimal nonlinearity for neurons receiving temporally correlated signals has a finite slope, quantitatively explaining the ubiquitous sigmoidal nonlinearities observed in neurons. The optimal nonlinearities we predict agree well with experimental data, with no free parameters in our model.
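The intuition behind a finite optimal slope can be illustrated with a toy model (my construction, not the model from the text): when a signal varies slowly, a hard threshold produces the same binary answer at every time step, while a shallower, stochastic sigmoid lets the spike count over time encode graded information about the signal.

```python
import numpy as np
from math import comb

def info_spike_count(k, T=10, n_s=201):
    """I(s; spike count) in bits for a toy neuron: a signal s ~ N(0,1) is
    held fixed over T time bins, and each bin emits a spike independently
    with probability sigmoid(k * s). Illustrates that for T > 1 a finite
    slope k can outperform a near-step nonlinearity (very large k)."""
    s = np.linspace(-4, 4, n_s)
    p_s = np.exp(-s**2 / 2)
    p_s /= p_s.sum()                        # discretized Gaussian prior
    q = 1.0 / (1.0 + np.exp(-k * s))        # per-bin spike probability
    # binomial likelihood p(n | s) for n = 0..T spikes
    n = np.arange(T + 1)
    coeff = np.array([comb(T, i) for i in n], dtype=float)
    p_n_given_s = (coeff[None, :] * q[:, None] ** n[None, :]
                   * (1 - q[:, None]) ** (T - n[None, :]))
    p_n = p_s @ p_n_given_s                 # marginal response distribution
    ratio = np.where(p_n_given_s > 0, p_n_given_s / p_n[None, :], 1.0)
    return float(np.sum(p_s[:, None] * p_n_given_s * np.log2(ratio)))
```

In this toy setting, a very steep sigmoid saturates near one bit (the sign of s), a vanishing slope transmits almost nothing, and an intermediate slope transmits the most, consistent with the finite-slope optimum described above.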
We further investigate the optimal network connectivity for information transmission. Using the olfactory system as a model, we analytically compute the optimal connectivity rate that maximizes information transmission. The optimal connectivity rate has a surprisingly simple expression: it is inversely proportional to the sparsity of the input pattern. Our model also provides a feedforward solution for reconstructing the odor signal. The resulting architecture is efficient, robust, and accounts for a number of experimental observations.
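One way to see why the connectivity rate should scale inversely with input sparsity: if each of N input channels is active with probability s, and each downstream neuron samples each channel with probability f, the expected number of active inputs per neuron is f·N·s; choosing f ∝ 1/s holds that drive constant as sparsity varies. The simulation below checks this scaling with hypothetical numbers (the function and parameters are illustrative, not the paper's model).

```python
import numpy as np

def mean_active_inputs(n_glom, sparsity, conn_rate,
                       n_cells=500, n_patterns=200, seed=0):
    """Random feedforward connectivity: each of n_cells downstream neurons
    connects to each of n_glom input channels with probability conn_rate;
    each channel is active with probability `sparsity` in each pattern.
    Returns the mean number of active inputs per neuron per pattern."""
    rng = np.random.default_rng(seed)
    conn = rng.random((n_cells, n_glom)) < conn_rate       # connectivity
    patterns = rng.random((n_patterns, n_glom)) < sparsity  # odor patterns
    counts = conn.astype(int) @ patterns.T.astype(int)      # active inputs
    return float(counts.mean())

# 1/sparsity rule (illustrative): to keep ~5 active inputs per neuron,
# set conn_rate = 5 / (n_glom * sparsity), so sparser inputs demand
# proportionally denser connectivity.
```

Across different sparsity levels, setting the connectivity rate inversely proportional to sparsity keeps the mean number of active inputs per downstream neuron fixed.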