This thesis reports on experiments aimed at explaining why machine learning algorithms using the greedy stochastic gradient descent (SGD) algorithm sometimes generalize better than algorithms using other optimization techniques. We propose two hypotheses, the "canyon effect" and "classification insensitivity", and illustrate them with two data sources. On these data sources, SGD generalizes more accurately than SVMperf, which performs more intensive optimization, across a wide range of regularization parameter settings. Finally, we report similar, but predictably less dramatic, effects on natural data.
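For readers unfamiliar with the setup being compared, the sketch below shows what "SGD with a regularization parameter" typically means in this context: a stochastic subgradient pass over the L2-regularized hinge loss, the standard linear SVM objective that batch solvers like SVMperf optimize much more thoroughly. This is a minimal illustrative sketch, not the thesis code; the function name, Pegasos-style step-size schedule, and epoch count are assumptions for illustration.

```python
import numpy as np

def sgd_linear_svm(X, y, lam=0.01, epochs=5, seed=0):
    """Illustrative SGD for the L2-regularized hinge loss (linear SVM).

    X    : (n, d) array of examples
    y    : (n,) array of +/-1 labels
    lam  : regularization parameter (the quantity swept in the experiments)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):       # one stochastic pass per epoch
            t += 1
            eta = 1.0 / (lam * t)          # decreasing step size (assumed schedule)
            margin = y[i] * (X[i] @ w)
            # subgradient of lam/2 * ||w||^2 + max(0, 1 - y_i <w, x_i>)
            grad = lam * w
            if margin < 1.0:
                grad -= y[i] * X[i]
            w -= eta * grad
    return w
```

A single greedy pass of this kind stops far from the exact minimizer of the regularized objective, which is precisely the contrast with SVMperf that the thesis examines.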
Advisor: Helmbold, David P.
Committee: Long, Philip M.; Warmuth, Manfred K.
School: University of California, Santa Cruz
Department: Computer Science
School Location: United States -- California
Source: MAI 50/05M, Masters Abstracts International
Source Type: Dissertation
Subjects: Computer science
Keywords: Classifications, Learning, Optimizations, Stochastic gradient descent
Publication Number: 1508167
ISBN: 978-1-267-26200-4