Dissertation/Thesis Abstract

Attention to Deep Structure in Recurrent Neural Networks
by Sharpe, Spencer S., M.S., University of Wyoming, 2017, 62; 10619110
Abstract (Summary)

Deep recurrent networks can build complex representations of sequential data, enabling them to perform tasks such as translating text from one language into another. These networks often use an attention mechanism that allows them to focus on the most relevant input representations. Because the attention weights are observable, they can provide insight into the strategies a network uses to identify structure in the input sequence. This study investigates attention in a deep recurrent network trained to annotate text, to determine how the distribution of information in the text affects learning and network performance. Results suggest that reversing the input text makes it difficult for the network to discover higher-order linguistic structure. This study contributes to an understanding of how our brains might process sequential data.
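(For readers unfamiliar with the mechanism mentioned above, the following is a minimal sketch of attention over recurrent hidden states, assuming a standard additive, Bahdanau-style scoring function. The function and variable names are illustrative only and do not reflect the thesis's actual implementation.)

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(hidden_states, query, W_h, W_q, v):
    """Compute an attention-weighted context vector.

    hidden_states: (T, d) encoder states, one per input token
    query:         (d,)   current decoder state
    W_h, W_q:      (a, d) projection matrices
    v:             (a,)   scoring vector
    """
    # Additive score for each time step: v^T tanh(W_h h_t + W_q q)
    scores = np.array([v @ np.tanh(W_h @ h + W_q @ query) for h in hidden_states])
    weights = softmax(scores)            # attention distribution over input positions
    context = weights @ hidden_states    # weighted sum of hidden states
    return context, weights

# Example with random values: inspecting `weights` shows which input
# positions the network attends to for a given decoding step.
rng = np.random.default_rng(0)
T, d, a = 5, 8, 6
h = rng.normal(size=(T, d))
q = rng.normal(size=d)
ctx, w = attend(h, q, rng.normal(size=(a, d)), rng.normal(size=(a, d)), rng.normal(size=a))
print(w.round(3), w.sum())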

Indexing (document details)
Advisor: Wright, Cameron H.G., Barrett, Steve
Committee: Prather, Jon, Sommer, Michael
School: University of Wyoming
Department: Neuroscience
School Location: United States -- Wyoming
Source: MAI 57/01M(E), Masters Abstracts International
Source Type: DISSERTATION
Subjects: Information science, Artificial intelligence, Computer science
Keywords: Artificial neural networks, Deep learning, Information entropy, Recurrent networks
Publication Number: 10619110
ISBN: 9780355537383