Dissertation/Thesis Abstract

Dynamics and Information Processing in Recurrent Networks
by Kuczala, Alexander, Ph.D., University of California, San Diego, 2019, 112 pp.; 27541198
Abstract (Summary)

Random recurrent networks make the analysis of large networks tractable. The spectrum of the connectivity matrix, obtained analytically with random matrix techniques, governs the network's linear dynamics as well as the stability of its nonlinear dynamics. Knowledge of the onset of chaos helps determine the network's computational capabilities and memory capacity. However, fully homogeneous random networks lack the nontrivial structure found in real-world networks, such as cell types and plasticity-induced correlations in neural networks. We address this deficiency by investigating the impact of correlations between forward and reverse connections, which may depend on neuronal type. Using random matrix theory, we derive a formula that efficiently computes the eigenvalue spectrum of large random matrices with block-structured correlations. These structured correlations distort the eigenvalue distribution in a nontrivial way: the distribution is neither a circle nor an ellipse. We find that layered networks with strong interlayer correlations have gapped spectra. For antisymmetric layered networks, oscillatory modes dominate the linear dynamics.
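The spectrum formula itself is derived in the dissertation; as a purely numerical illustration, the sketch below samples a large matrix whose forward and reverse entries J_ij, J_ji are correlated with a block-dependent coefficient and inspects its eigenvalues directly. The block sizes, variances, and correlation values tau are illustrative assumptions, not the dissertation's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
half = N // 2

# Correlation tau[a, b] between a forward entry J_ij (i in block a,
# j in block b) and its reverse entry J_ji; values are assumptions.
tau = np.array([[0.5, -0.8],
                [-0.8, 0.5]])
block = (np.arange(N) >= half).astype(int)
T = tau[np.ix_(block, block)]          # per-entry correlation coefficients

# Draw correlated Gaussian pairs (J_ij, J_ji) via two independent matrices.
A = rng.standard_normal((N, N))
B = rng.standard_normal((N, N))
M = T * A.T + np.sqrt(1 - T**2) * B    # reverse entries, correlated with A
J = (np.triu(A, 1) + np.tril(M, -1)) / np.sqrt(N)

eigs = np.linalg.eigvals(J)
print("spectral radius:", np.abs(eigs).max())
# A scatter plot of eigs.real vs eigs.imag shows a cloud that is
# neither a circle nor an ellipse when the block correlations differ.
```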

We analyze the effect of structured correlations on the nonlinear dynamics of rate networks by developing a set of dynamical mean-field equations applicable to large system sizes. We find that the power spectrum of strongly antisymmetric bipartite networks peaks at nonzero frequency, mirroring the gap present in the eigenvalue distribution.
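As a finite-size probe of this prediction (not the mean-field equations themselves), one can integrate a rate network with strongly anticorrelated bipartite coupling and examine the power spectrum of a single unit. The gain g, correlation tau_c, network size, and integration scheme below are illustrative assumptions; the dissertation's result is that, in this regime, the spectrum peaks away from zero frequency.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g, tau_c, dt, steps = 400, 5.0, -0.8, 0.02, 20000
half = N // 2

# Bipartite coupling: block 1 -> 2 is A; block 2 -> 1 is strongly
# anticorrelated with A's transpose (tau_c near -1, nearly antisymmetric).
A = rng.standard_normal((half, half))
C = rng.standard_normal((half, half))
B = tau_c * A.T + np.sqrt(1 - tau_c**2) * C
J = np.zeros((N, N))
J[:half, half:] = g * A / np.sqrt(N)
J[half:, :half] = g * B / np.sqrt(N)

# Integrate dx/dt = -x + J tanh(x) with forward Euler.
x = 0.1 * rng.standard_normal(N)
trace = np.empty(steps)
for t in range(steps):
    x += dt * (-x + J @ np.tanh(x))
    trace[t] = x[0]

sig = trace[steps // 2:]                       # discard the transient
spec = np.abs(np.fft.rfft(sig - sig.mean()))**2
freq = np.fft.rfftfreq(sig.size, d=dt)
print("peak frequency:", freq[spec.argmax()])  # predicted to lie away from zero
```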

Heterogeneous connection statistics allow for strongly feed-forward connections in addition to recurrent ones, both of which promote signal amplification. We investigate the role of feed-forward amplification in i.i.d. block-structured networks by computing the Fisher information of past input perturbations. We apply this result to find the optimal architecture for information retention in two populations under energy constraints. We find that this architecture is both strongly feed-forward and recurrent, with the respective strengths of these connections depending on the available synaptic gain. Finally, we assess the ability of rate networks to dynamically approximate the dominant mode of a random symmetric matrix. Given an initial estimate of the eigenvector as input, we find that there is an optimal processing time and synaptic gain, depending on the dimensionality and quality of the initial estimate.
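For the linear, block-structured case, the Fisher information about a pulse injected k steps in the past has a closed form: if x_t = W x_{t-1} + v s_t + noise, then J(k) = (W^k v)^T C^{-1} (W^k v), with C the stationary noise covariance. The sketch below evaluates this for a hypothetical two-population network with assumed feed-forward and recurrent strengths; it is a schematic of the computation, not the dissertation's optimized architecture.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

rng = np.random.default_rng(3)
n = 100                       # units per population (assumed)
ff, rec = 0.8, 0.5            # feed-forward / recurrent strengths (assumed)

# Two populations: random recurrent coupling within each,
# feed-forward coupling from population 1 to population 2.
def blk(s):
    return s * rng.standard_normal((n, n)) / np.sqrt(2 * n)

W = np.block([[blk(rec), np.zeros((n, n))],
              [blk(ff),  blk(rec)]])

# Stationary noise covariance solves C = W C W^T + I (unit input noise).
C = solve_discrete_lyapunov(W, np.eye(2 * n))
Cinv = np.linalg.inv(C)

v = np.zeros(2 * n)
v[:n] = 1 / np.sqrt(n)        # unit-norm input vector into population 1

u = v.copy()
for k in range(8):
    print(f"J({k}) = {u @ Cinv @ u:.4f}")   # Fisher memory curve
    u = W @ u                               # propagate the pulse one step
```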

Indexing (document details)
Advisor: Sharpee, Tatyana O., Vergassola, Massimo
Committee: Arovas, Daniel, Sejnowski, Terrence J., Stevens, Charles F., Wu, Congjun
School: University of California, San Diego
Department: Physics
School Location: United States -- California
Source: DAI-B 81/7(E), Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Physics
Keywords: Dynamical mean field theory, Fisher information, Neural networks, Non-Hermitian, Random matrix theory
Publication Number: 27541198
ISBN: 9781392636633