Dissertation/Thesis Abstract

Attention to Deep Structure in Recurrent Neural Networks
by Sharpe, Spencer S., M.S., University of Wyoming, 2017, 62; 10619110
Abstract (Summary)

Deep recurrent networks can build complex representations of sequential data, enabling tasks such as translating text from one language to another. These networks often use an attention mechanism that allows them to focus on the most relevant input representations. Network attention can also provide insight into the strategies a network uses to identify structure in the input sequence. This study investigates attention in a deep recurrent network trained to annotate text, to determine how the distribution of information in the text affects learning and network performance. Results suggest that reversing the input text makes it difficult for the network to discover higher-order linguistic structure. This study contributes to an understanding of how our brains might process sequential data.
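The attention mechanism described above can be illustrated with a minimal sketch. This is not the author's model; it is a generic dot-product attention over a sequence of recurrent hidden states, where a query vector scores each timestep and a softmax turns the scores into a distribution that weights the states into a context vector. The array shapes and function name are illustrative assumptions.

```python
import numpy as np

def dot_product_attention(hidden_states, query):
    """Attend over a sequence of hidden states.

    hidden_states: (T, d) array, one recurrent state per timestep.
    query: (d,) vector used to score each timestep.
    Returns (weights, context): the attention distribution over
    timesteps and the attention-weighted context vector.
    """
    scores = hidden_states @ query                   # (T,) alignment scores
    scores = scores - scores.max()                   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax -> attention
    context = weights @ hidden_states                # (d,) weighted sum
    return weights, context

# Toy example: 4 timesteps, 3-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))
q = rng.standard_normal(3)
w, c = dot_product_attention(H, q)
```

Inspecting `w` shows which timesteps the network "focuses on," which is the kind of signal the thesis uses to probe how the network discovers structure in the input sequence.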

Indexing (document details)
Advisors: Wright, Cameron H.G., Barrett, Steve
Committee: Prather, Jon, Sommer, Michael
School: University of Wyoming
Department: Neuroscience
School Location: United States -- Wyoming
Source: MAI 57/01M(E), Masters Abstracts International
Subjects: Information science, Artificial intelligence, Computer science
Keywords: Artificial neural networks, Deep learning, Information entropy, Recurrent networks
Publication Number: 10619110
ISBN: 978-0-355-53738-3
Copyright © 2021 ProQuest LLC. All rights reserved.