Dissertation/Thesis Abstract

Weighted Distance Weighted Discrimination and pairwise variable selection for classification
by Qiao, Xingye, Ph.D., The University of North Carolina at Chapel Hill, 2010, 136; 3418734
Abstract (Summary)

Statistical machine learning has attracted considerable attention in recent years due to its broad applications across many fields. The driving statistical problem throughout this dissertation is classification, and the dissertation covers two major topics in classification.

The first topic is weighted Distance Weighted Discrimination (weighted DWD or wDWD), an improved version of a recently proposed classification method. We show that significant improvements are available in several situations. Using our proposed optimal weighting schemes, we show that wDWD is Fisher consistent under the overall misclassification criterion. In addition, we propose three alternative criteria and provide the corresponding optimal weights or adaptive weighting schemes for each. Mathematical validation of these ideas is provided through the High-Dimensional, Low Sample-Size (HDLSS) asymptotic properties of wDWD. An important contribution is a weakening of the assumptions of Hall et al. (2005) and Ahn et al. (2007), and we then extend the results to the two-class setting. The HDLSS asymptotic analysis of wDWD comprises two results: one concerns the misclassification rate of wDWD, and the other characterizes the angle between the DWD direction and the optimal classification direction.
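
To make the role of the weights concrete, the following is a minimal sketch of a weighted DWD fit as a convex program in Python with CVXPY. It assumes the standard DWD formulation, minimize sum_i (1/r_i + C*xi_i) subject to r_i = y_i*(x_i' omega + beta) + xi_i, xi_i >= 0 and ||omega|| <= 1, and simply rescales each observation's term by a class weight w_pos or w_neg. The function name, the CVXPY interface, and the equal-weight defaults are illustrative assumptions, not the dissertation's implementation; different choices of w_pos and w_neg would correspond to the different weighting criteria described above.

    import numpy as np
    import cvxpy as cp

    def weighted_dwd(X, y, C=100.0, w_pos=1.0, w_neg=1.0):
        """Hypothetical weighted DWD sketch; y holds labels in {-1, +1}."""
        n, d = X.shape
        omega = cp.Variable(d)             # linear discrimination direction
        beta = cp.Variable()               # intercept
        xi = cp.Variable(n, nonneg=True)   # slack variables
        # residuals r_i = y_i (x_i' omega + beta) + xi_i; inv_pos forces r_i > 0
        r = cp.multiply(y, X @ omega + beta) + xi
        wts = np.where(y > 0, w_pos, w_neg)  # per-observation class weights
        objective = cp.sum(cp.multiply(wts, cp.inv_pos(r) + C * xi))
        problem = cp.Problem(cp.Minimize(objective), [cp.norm(omega, 2) <= 1])
        problem.solve()                    # a conic solver (e.g., SCS) handles inv_pos
        return omega.value, beta.value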

The second topic of this dissertation is variable selection for classification. The goal is to find variables that have weak marginal effects but lead to good classification results when viewed jointly. To accomplish this, we use a within-class permutation test called the Significance test of Joint Effect (SigJEff). The output of SigJEff is the set of variable pairs with statistically significant joint effects. To extend our scope to joint effects involving more than two variables, we introduce a new visualization approach for displaying multiscale joint effects, called the Multiscale Significance Display (MSD), and a general framework for variable selection procedures based on MSD, called Multiscale Variable Screening (MVS). MSD is a moving-window approach: the window slides along an ordering of the variables, and the joint effects of the variables within each window are evaluated. MVS seeks the best initial ordering in an iterative manner.
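
As a concrete illustration of the within-class permutation idea behind SigJEff, the sketch below tests one pair of variables: it fits a simple classifier on the pair, then builds a null distribution by permuting one variable within each class, which preserves the class-conditional marginals while destroying the pair's joint structure. The use of in-sample logistic-regression accuracy as the joint-effect statistic, the permutation scheme, and the function name are assumptions for illustration; the statistic and scheme used in the dissertation may differ.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def pairwise_joint_effect_pvalue(X, y, j, k, n_perm=500, seed=0):
        """Hypothetical within-class permutation test for variables j and k."""
        rng = np.random.default_rng(seed)
        Z = X[:, [j, k]]

        def joint_stat(Zc):
            # in-sample accuracy of a linear classifier on the pair (assumed statistic)
            return LogisticRegression().fit(Zc, y).score(Zc, y)

        observed = joint_stat(Z)
        null = np.empty(n_perm)
        for b in range(n_perm):
            Zb = Z.copy()
            for c in np.unique(y):
                idx = np.where(y == c)[0]
                # permute variable k within class c to break the joint structure
                Zb[idx, 1] = Zb[rng.permutation(idx), 1]
            null[b] = joint_stat(Zb)
        # permutation p-value with the usual +1 correction
        return (1 + np.sum(null >= observed)) / (1 + n_perm)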

Indexing (document details)
Advisor: Marron, James S., Liu, Yufeng
Committee: Kelly, Douglas G., Liu, Yufeng, Marron, James S., Nobel, Andrew B., Zhang, Hao H.
School: The University of North Carolina at Chapel Hill
Department: Statistics
School Location: United States -- North Carolina
Source: DAI-B 71/09, Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Statistics
Keywords: Linear discrimination, Low sample-size data, Nonstandard asymptotics, Optimal weights, Pairwise, Variable selection, WDWD
Publication Number: 3418734
ISBN: 9781124172866