Dissertation/Thesis Abstract

BCAP: An Artificial Neural Network Pruning Technique to Reduce Overfitting
by Brantley, Kiante, M.S., University of Maryland, Baltimore County, 2016, 71; 10140605
Abstract (Summary)

Determining the optimal size of a neural network is complicated. Neural networks with many free parameters can be used to solve very complex problems, but such networks are susceptible to overfitting. BCAP (Brantley-Clark Artificial Neural Network Pruning Technique) addresses overfitting by combining duplicate neurons in a neural network's hidden layer, thereby forcing the network to learn more distinct features. We compare hidden units using cosine similarity and combine those that are similar to each other within a threshold ϵ. Because hidden units that are highly correlated (i.e., similar) are combined, the co-adaptation of the neurons in the network is reduced. In this paper we show evidence that BCAP succeeds in reducing network size while maintaining or improving the accuracy of neural networks during and after training.
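The merging step the abstract describes can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: it assumes a single hidden layer with incoming weight matrix `W_in` (inputs × hidden units) and outgoing weight matrix `W_out` (hidden units × outputs), and it interprets "similar within a threshold ϵ" as cosine similarity ≥ 1 − ϵ between incoming weight vectors. When two units are deemed duplicates, one unit's outgoing weights are folded into the other's and the duplicate is dropped.

```python
import numpy as np

def prune_similar_units(W_in, W_out, eps=0.05):
    """Hypothetical BCAP-style pruning sketch: merge hidden units whose
    incoming weight vectors have cosine similarity >= 1 - eps.
    W_in: (n_inputs, n_hidden), W_out: (n_hidden, n_outputs)."""
    W_out = W_out.copy()            # avoid mutating the caller's weights
    keep = list(range(W_in.shape[1]))
    merged = True
    while merged:
        merged = False
        # Cosine similarity between the surviving incoming weight columns.
        cols = W_in[:, keep]
        unit = cols / np.linalg.norm(cols, axis=0)
        sim = unit.T @ unit
        n = len(keep)
        for i in range(n):
            for j in range(i + 1, n):
                if sim[i, j] >= 1 - eps:
                    # Fold unit j's outgoing weights into unit i, drop j,
                    # then recompute similarities on the smaller network.
                    W_out[keep[i], :] += W_out[keep[j], :]
                    keep.pop(j)
                    merged = True
                    break
            if merged:
                break
    return W_in[:, keep], W_out[keep, :]
```

With two identical hidden units, the function returns a network one unit smaller whose merged outgoing weights preserve the layer's output for those units.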


Indexing (document details)
Advisor: Oates, Tim
Committee: Chen, Jian; Clark, Greg
School: University of Maryland, Baltimore County
Department: Computer Science
School Location: United States -- Maryland
Source: MAI 55/05M(E), Masters Abstracts International
Subjects: Artificial intelligence, Computer science
Keywords: Deep learning, Neural network optimization, Neural networks, Pruning neural networks
Publication Number: 10140605
ISBN: 9781339960029