Dynamics and Information Processing in Recurrent Networks

Abstract

Random recurrent networks make the analysis of large networks tractable. The spectrum of the connectivity matrix, computed analytically with random matrix techniques, governs the network's linear dynamics as well as the stability of the nonlinear dynamics. Knowledge of the onset of chaos helps determine the network's computational capabilities and memory capacity. However, fully homogeneous random networks lack the nontrivial structure found in real-world networks, such as the cell types and plasticity-induced correlations of neural circuits. We address this deficiency by investigating the impact of correlations between forward and reverse connections, which may depend on neuronal type. Using random matrix theory, we derive a formula that efficiently computes the eigenvalue spectrum of large random matrices with block-structured correlations. Structured correlations distort the eigenvalue distribution in a nontrivial way: its support is neither a circle nor an ellipse. We find that layered networks with strong interlayer correlations have gapped spectra. For antisymmetric layered networks, oscillatory modes dominate the linear dynamics.
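As a rough numerical companion to this construction, the sketch below samples a two-population connectivity matrix in which the correlation between the forward weight J_ij and the reverse weight J_ji depends on the block membership of neurons i and j, then computes the empirical spectrum. The block-correlation values, gain, and population sizes are illustrative assumptions, not parameters from the dissertation.

import numpy as np

# Assumed example: two equal populations; the correlation between J_ij and
# J_ji is set by the block labels of i and j via tau[a, b]. The strongly
# negative interlayer value mimics an antisymmetric layered network.
rng = np.random.default_rng(0)
N = 1000
membership = np.repeat([0, 1], N // 2)        # population label of each neuron
g = 1.0                                       # overall synaptic gain (assumed)
tau = np.array([[0.0, -0.8],
                [-0.8, 0.0]])                 # tau[a, b] = corr(J_ij, J_ji)

# Draw correlated weight pairs (J_ij, J_ji) for all i < j.
J = np.zeros((N, N))
iu, ju = np.triu_indices(N, k=1)
t = tau[membership[iu], membership[ju]]
z1 = rng.standard_normal(iu.size)
z2 = t * z1 + np.sqrt(1.0 - t**2) * rng.standard_normal(iu.size)
J[iu, ju] = z1
J[ju, iu] = z2
J *= g / np.sqrt(N)                           # standard 1/sqrt(N) scaling

eigs = np.linalg.eigvals(J)
print("largest real part:", eigs.real.max())  # linear stability indicator

Plotting eigs in the complex plane (e.g., with matplotlib) gives an empirical check on the analytic spectrum: for strong interlayer correlation of this antisymmetric kind, the eigenvalue density concentrates away from the real axis, consistent with the gapped, oscillation-dominated regime described above.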

We analyze the effect of structured correlations on the nonlinear dynamics of rate networks by developing a set of dynamical mean field equations valid in the limit of large system size. We find that the power spectrum of strongly antisymmetric bipartite networks peaks at nonzero frequency, mirroring the gap in the eigenvalue distribution. Heterogeneous connection statistics permit strongly feed-forward connections alongside recurrent ones, both of which promote signal amplification. We investigate the role of feed-forward amplification in i.i.d. block-structured networks by computing the Fisher information of past input perturbations. We apply this result to find the optimal architecture for information retention in two populations under energy constraints. We find that this architecture is both strongly feed-forward and recurrent, with the respective strengths of these connections depending on the available synaptic gain. Finally, we assess the ability of rate networks to dynamically approximate the dominant mode of a random symmetric matrix. Given an initial estimate of the eigenvector as input, we find that there is an optimal processing time and synaptic gain, both depending on the dimensionality and quality of the initial estimate.
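A minimal sketch of the last idea, under assumed dynamics (a standard rate model x' = -x + g A tanh(x), which is a common choice but not confirmed to be the dissertation's exact model): the network is seeded with a noisy estimate of the top eigenvector of a random symmetric matrix A, and the overlap with the true eigenvector is tracked over time to expose an optimal readout time. The gain, noise level, and matrix size are illustrative.

import numpy as np

# Assumed dynamics: recurrent amplification of the dominant mode of a
# GOE-like symmetric matrix, starting from a noisy eigenvector estimate.
rng = np.random.default_rng(1)
N = 400
A = rng.standard_normal((N, N))
A = (A + A.T) / np.sqrt(2 * N)                # symmetric, spectral edge near 2

w, V = np.linalg.eigh(A)
v_top = V[:, -1]                              # true dominant eigenvector

snr = 1.0                                     # quality of the initial estimate
x = v_top + rng.standard_normal(N) / snr
x /= np.linalg.norm(x)

g, dt, steps = 2.0, 0.01, 3000                # synaptic gain, Euler step, horizon
overlaps = []
for _ in range(steps):
    x = x + dt * (-x + g * A @ np.tanh(x))    # rate dynamics
    overlaps.append(abs(v_top @ x) / np.linalg.norm(x))

best = int(np.argmax(overlaps))
print(f"best overlap {overlaps[best]:.3f} at t = {best * dt:.2f}")

Running this with different N, snr, and g illustrates the trade-off described in the abstract: too little processing time leaves the estimate noisy, while too much lets subdominant modes and saturation degrade it, so the peak overlap occurs at a finite time that shifts with dimensionality and initial-estimate quality.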
