Random recurrent networks make the analysis of large networks tractable. The spectrum of the connectivity matrix, which can be computed analytically with random matrix techniques, governs both the network's linear dynamics and the stability of its nonlinear dynamics. Knowing where the onset of chaos lies helps determine the network's computational capabilities and memory capacity. However, fully homogeneous random networks lack the nontrivial structure found in real-world networks, such as cell types and plasticity-induced correlations in neural circuits. We address this deficiency by investigating the impact of correlations between forward and reverse connections, which may depend on neuronal type. Using random matrix theory, we derive a formula that efficiently computes the eigenvalue spectrum of large random matrices with block-structured correlations. Structured correlations distort the eigenvalue distribution in a nontrivial way: it is neither a circle nor an ellipse. We find that layered networks with strong interlayer correlations have gapped spectra, and that in antisymmetric layered networks oscillatory modes dominate the linear dynamics.
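The block-structured formula itself is derived in the thesis; as a minimal numerical illustration, the homogeneous special case reduces to the classical elliptic law. The sketch below (an assumption-laden toy, not the thesis's construction) imposes a correlation tau between each forward weight J_ij and its reverse partner J_ji by mixing independent symmetric and antisymmetric Gaussian parts, then checks that the eigenvalues fill an ellipse with semi-axes g(1+tau) and g(1-tau):

```python
import numpy as np

rng = np.random.default_rng(0)
n, g, tau = 1000, 1.0, 0.5  # network size, coupling gain, corr(J_ij, J_ji)

# Impose corr(J_ij, J_ji) = tau by mixing independent symmetric and
# antisymmetric Gaussian parts (homogeneous case; the block-structured
# theory would assign a different tau to each pair of blocks).
a, b = rng.normal(size=(n, n)), rng.normal(size=(n, n))
S = (a + a.T) / np.sqrt(2)  # symmetric part, unit-variance off-diagonal entries
A = (b - b.T) / np.sqrt(2)  # antisymmetric part, unit-variance entries
J = g / np.sqrt(n) * (np.sqrt((1 + tau) / 2) * S + np.sqrt((1 - tau) / 2) * A)

ev = np.linalg.eigvals(J)
# Elliptic law: spectrum fills an ellipse with semi-axes g(1+tau), g(1-tau).
print(ev.real.max(), np.abs(ev.imag).max())  # ~ g(1+tau) = 1.5, ~ g(1-tau) = 0.5
```

At tau = -1 the construction is purely antisymmetric and the ellipse collapses onto the imaginary axis, which is the linear-algebra counterpart of the oscillatory modes noted above.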
We analyze the effect of structured correlations on the nonlinear dynamics of rate networks by developing a set of dynamical mean-field equations applicable at large system sizes. We find that the power spectrum of strongly antisymmetric bipartite networks peaks at nonzero frequency, mirroring the gap present in the eigenvalue distribution.
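For orientation, the homogeneous version of such mean-field equations (the classical single-site reduction for rate networks) replaces the coupled system $\dot{x}_i = -x_i + \sum_j J_{ij}\,\phi(x_j)$ by one stochastic process driven by a self-consistent Gaussian field; the block-structured theory of the thesis generalizes this with population-indexed correlation functions. A sketch of the standard homogeneous form:

```latex
\dot{x}(t) = -x(t) + \eta(t), \qquad
\langle \eta(t)\,\eta(t') \rangle = g^{2}\, C_{\phi}(t - t'), \qquad
C_{\phi}(\tau) = \langle \phi(x(t))\,\phi(x(t+\tau)) \rangle .
```

The power spectrum discussed above is the Fourier transform of the self-consistent autocorrelation $C_{\phi}$.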
Heterogeneous connection statistics allow strongly feed-forward connections to coexist with recurrent ones; both promote signal amplification. We investigate the role of feed-forward amplification in i.i.d. block-structured networks by computing the Fisher information that the network state retains about past input perturbations. We apply this result to find the optimal architecture for information retention in a two-population network under energy constraints, and find that this architecture is both strongly feed-forward and strongly recurrent, with the relative strengths of these connections set by the available synaptic gain. Finally, we assess the ability of rate networks to dynamically approximate the dominant mode of a random symmetric matrix. Given an initial estimate of the eigenvector as input, we find an optimal processing time and synaptic gain that depend on the dimensionality and quality of the initial estimate.
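The dominant-mode computation has a simple discrete-time analogue: power iteration on a shifted symmetric matrix sharpens a noisy initial eigenvector estimate, with iteration count playing the role of processing time. The sketch below is a minimal stand-in (it is not the thesis's continuous-time rate network); the shift c keeps the top eigenvalue dominant in magnitude despite the spectrum's symmetry about zero:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
a = rng.normal(size=(n, n))
J = (a + a.T) / np.sqrt(2 * n)  # GOE-like symmetric matrix, spectral radius ~ 2

evals, evecs = np.linalg.eigh(J)
v_top = evecs[:, -1]            # ground-truth dominant eigenvector

# Noisy initial estimate of the dominant mode, given as "input".
x = v_top + 0.5 * rng.normal(size=n)
x /= np.linalg.norm(x)

# Power iteration on J + c*I with c larger than the spectral radius, so the
# top eigenvalue of J is also the largest-magnitude eigenvalue of J + c*I.
c = 2.5
for _ in range(3000):
    x = J @ x + c * x
    x /= np.linalg.norm(x)

overlap = abs(x @ v_top)  # alignment with the true dominant eigenvector
```

In this toy version the trade-off discussed above is visible directly: a poorer initial estimate or a smaller spectral gap (higher effective dimensionality) demands more iterations before `overlap` approaches 1.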
|Advisor:||Sharpee, Tatyana O., Vergassola, Massimo|
|Committee:||Arovas, Daniel, Sejnowski, Terrence J., Stevens, Charles F., Wu, Congjun|
|School:||University of California, San Diego|
|School Location:||United States -- California|
|Source:||DAI-B 81/7(E), Dissertation Abstracts International|
|Keywords:||Dynamical mean field theory, Fisher information, Neural networks, Non-hermitian, Random matrix theory|