Although the loss functions of deep neural networks are highly non-convex, gradient-based optimization algorithms converge to approximately the same performance from many random initial points. This makes neural networks easy to train, which, combined with their high representational capacity and with implicit and explicit regularization strategies, leads to high-quality machine-learned algorithms at reasonable computational cost across a wide variety of domains.
One thread of work has focused on explaining this phenomenon by numerically characterizing the local curvature at critical points of the loss function, where gradients are zero. Such studies, supported by arguments from random matrix theory, have reported that the loss functions used to train neural networks have no local minima that are much worse than the global minima. More recent theoretical work, however, has suggested that bad local minima do exist.
In this dissertation, we show that one cause of this gap is that the methods used to numerically find critical points of neural network losses suffer, ironically, from a bad local minimum problem of their own. This problem is caused by gradient-flat points, where the gradient vector lies in the kernel of the Hessian matrix of second partial derivatives. At these points the loss function becomes, to second order, linear in the direction of the gradient, since the quadratic term along that direction, g^T H g, vanishes when H g = 0; this violates the assumptions necessary to guarantee convergence for second-order critical point-finding methods. We present evidence that approximately gradient-flat points are a common feature of several prototypical neural network loss functions.
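To make the gradient-flatness condition concrete, the following is a minimal numerical sketch in JAX. The toy loss, the test point, the residual measure, and the tolerance are illustrative assumptions, not the dissertation's actual losses or criteria.

```python
# Sketch: testing whether a point is approximately gradient-flat,
# i.e. whether the gradient g lies (nearly) in the kernel of the
# Hessian H, so that H g is approximately zero.
import jax
import jax.numpy as jnp

def loss(theta):
    # Hypothetical toy non-convex loss standing in for a network loss L(theta).
    return jnp.sum(theta[0] * theta[1:] ** 2) + 0.01 * jnp.sum(theta ** 2)

theta = jnp.array([0.5, 1.0, -2.0, 0.3])  # assumed test point

g = jax.grad(loss)(theta)      # gradient of the loss at theta
H = jax.hessian(loss)(theta)   # Hessian of second partial derivatives

# Scale ||H g|| by ||H|| * ||g|| so the residual is dimensionless.
residual = jnp.linalg.norm(H @ g) / (
    jnp.linalg.norm(H, ord=2) * jnp.linalg.norm(g)
)
print(f"relative gradient-flatness residual: {residual:.3e}")

# A small residual (e.g. below 1e-6, an assumed tolerance) would indicate
# an approximately gradient-flat point: to second order, the loss is
# linear along g, and second-order critical-point finders can stall there.
```

In this sketch, a full dense Hessian is formed for clarity; for realistic network dimensions one would instead use Hessian-vector products (e.g. via `jax.jvp` of the gradient) to compute H g without materializing H.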
Advisor: Bouchard, Kristofer E
Committee: DeWeese, Michael; Olshausen, Bruno; Hardt, Moritz
School: University of California, Berkeley
School Location: United States -- California
Source: DAI-B 82/4(E), Dissertation Abstracts International
Subjects: Computer science; Neurosciences; Applied Mathematics
Keywords: Critical points; Deep learning; Neural networks; Optimization