Dissertation/Thesis Abstract

Learning in chaotic recurrent neural networks
by Sussillo, David C., Ph.D., Columbia University, 2009, 134; 3346497
Abstract (Summary)

Training recurrent neural networks (RNNs) is a long-standing open problem in both theoretical neuroscience and machine learning. In particular, training chaotic RNNs was previously thought to be impossible. While some traditional methods for training RNNs exist, they are generally considered weak and typically fail on anything but the simplest problems and smallest networks. We review previous methods, such as gradient-descent approaches and their problems, as well as more recent approaches such as the Echo State Network and related ideas. We show that chaotic RNNs can be trained to generate multiple patterns. Further, we introduce a novel supervised learning paradigm, which we call FORCE learning, that accomplishes this training. The network architectures we analyze range, at one extreme, from training only the input weights to a readout unit that has strong feedback to the network to, at the other extreme, generic learning of all synapses within the RNN. We present these models as potential networks for motor pattern generation that can learn multiple, high-dimensional patterns while coping with the complexities of a recurrent network that may have spontaneous, ongoing, and complex dynamics. We show an example of a single RNN that can generate the aperiodic dynamics of all 95 joint angles for both human walking and running motions obtained via motion capture. Finally, we apply the learning techniques developed for chaotic RNNs to a novel, unsupervised method for extracting predictable signals from high-dimensional time series data, if such predictable signals exist.
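To give a concrete sense of the first architecture described above (training only the weights onto a readout unit that feeds back into a chaotic network), the following is a minimal sketch of FORCE-style learning using recursive least squares on the readout weights. It is not the dissertation's code; the network size, gain g, regularizer alpha, and the sinusoidal target are illustrative assumptions chosen only to make the example self-contained and runnable.

```python
import numpy as np

# Minimal FORCE-learning sketch: a chaotic rate network with a single
# readout z = w . r that is fed back into the network, with the readout
# weights w trained online by recursive least squares (RLS).
# All parameter values below are illustrative, not taken from the thesis.

rng = np.random.default_rng(0)

N = 300          # number of recurrent units (illustrative)
g = 1.5          # gain > 1 puts the untrained network in the chaotic regime
dt, tau = 0.1, 1.0
alpha = 1.0      # RLS regularizer: P starts as I / alpha

J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # recurrent weights (fixed)
w_fb = 2.0 * rng.random(N) - 1.0                    # feedback weights (fixed)
w = np.zeros(N)                                      # readout weights (learned)
P = np.eye(N) / alpha                                # running inverse correlation matrix

x = 0.5 * rng.standard_normal(N)                     # network state
r = np.tanh(x)                                        # firing rates
z = w @ r                                             # readout

T = 2000
ts = np.arange(T) * dt
target = np.sin(2 * np.pi * ts / 50.0)                # simple periodic target signal

for t in range(T):
    # Network dynamics with the readout fed back into the network.
    x += dt / tau * (-x + J @ r + w_fb * z)
    r = np.tanh(x)
    z = w @ r

    # RLS / FORCE update of the readout weights only.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    e = z - target[t]        # error measured before the weight update
    w -= e * k               # rapid update keeps the output error small throughout training
```

After training, freezing w and running the same loop without the weight update should show the network autonomously reproducing the target; the other extreme mentioned in the abstract (learning all synapses within the RNN) applies the analogous update to every row of J rather than to a single readout.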

Indexing (document details)
Advisor: Abbott, L. F.
Committee:
School: Columbia University
School Location: United States -- New York
Source: DAI-B 70/02, Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Neurosciences, Computer science
Keywords: Neural networks, Recurrent neural networks
Publication Number: 3346497
ISBN: 978-1-109-01497-6