Boosting algorithms produce accurate predictors for complex phenomena by welding together collections of simple predictors. In the classical method AdaBoost, as well as its immediate variants, the welding points are determined by convex optimization; unlike typical applications of convex optimization in machine learning, however, the AdaBoost scheme eschews the usual regularization and constraints used to control numerical and statistical properties.
On the other hand, the data and the simple predictors impose rigid structure on the behavior of AdaBoost variants, and convex duality provides a lens through which this rigidity can be analyzed. This structure is fundamental to the properties of these methods and, in particular, yields numerical and statistical convergence rates.
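To make the scheme described above concrete, the following is a minimal sketch of AdaBoost (not code from the dissertation) using decision stumps as the simple predictors. The step size alpha is the exact, unconstrained line-search minimizer of the exponential loss along the chosen stump's direction, with no regularization or constraint, and the example weights play the role of the dual variables maintained by the algorithm.

```python
import numpy as np

def adaboost(X, y, n_rounds=50):
    """Minimal AdaBoost with decision stumps as the simple predictors.

    X: (n, d) real-valued features; y: (n,) labels in {-1, +1}.
    Returns a list of (alpha, feature, threshold, polarity) stumps.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)  # example weights (the dual distribution)
    ensemble = []
    for _ in range(n_rounds):
        # Weak learning step: find the stump with smallest weighted error.
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        if err >= 0.5:  # no stump beats random guessing; stop
            break
        # The "welding point": alpha exactly minimizes the exponential
        # loss along this stump's direction -- no regularization, no
        # constraint, as the abstract emphasizes.
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = pol * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()  # renormalize the dual distribution over examples
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote of the welded-together stumps."""
    score = np.zeros(X.shape[0])
    for alpha, j, thr, pol in ensemble:
        score += alpha * pol * np.where(X[:, j] <= thr, 1, -1)
    return np.sign(score)
```

For a toy run, `adaboost(X, np.sign(X[:, 0]))` on random `X = np.random.randn(200, 2)` recovers a small ensemble whose `predict` matches the labels; the point of the sketch is only to show where the unregularized convex line search enters.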
Committee: Chaudhuri, Kamalika; Fitzsimmons, Patrick; Freund, Yoav; Gill, Philip; Schapire, Robert
School: University of California, San Diego
Department: Computer Science and Engineering
School Location: United States -- California
Source: DAI-B 74/11(E), Dissertation Abstracts International
Subjects: Statistics; Computer science
Keywords: Boosting, Convex analysis, Data dependence, Duality, Machine learning