Latent variable models have two basic components: a latent structure encoding a hypothesized complex pattern and an observation model capturing the data distribution. With advances in machine learning and the increasing availability of computational resources, we are able to perform inference in deeper and more sophisticated latent variable models. In most cases, these models are designed with a particular application in mind; hence, they tend to have restrictive observation models. The challenge, which has surfaced with the increasing diversity of data sets, is to generalize these latent models to work with different data types. We address this problem by using exponential dispersion models (EDMs) and proposing mechanisms for incorporating them into latent structures. (Abstract shortened by ProQuest.)
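The keywords below link exponential dispersion models to Bregman divergences and clustering. As a hedged illustration only (a sketch of the general EDM-to-Bregman connection, not the author's actual method), the assignment step of Bregman hard clustering under an assumed Poisson observation model, whose matching Bregman divergence is the generalized KL divergence, might look like:

```python
import numpy as np

def poisson_bregman(x, mu):
    # Generalized KL divergence d(x, mu) = x*log(x/mu) - x + mu,
    # the Bregman divergence induced by the Poisson log-partition,
    # with the convention 0*log(0) = 0.
    x = np.asarray(x, dtype=float)
    mu = np.asarray(mu, dtype=float)
    term = np.where(x > 0, x * np.log(x / mu), 0.0)
    return np.sum(term - x + mu, axis=-1)

def assign_clusters(X, centroids):
    # Hard-assignment step: each point goes to the centroid
    # with the smallest Bregman divergence.
    d = np.stack([poisson_bregman(X, c) for c in centroids], axis=1)
    return np.argmin(d, axis=1)

# Toy count-like data and two hypothetical centroids.
X = np.array([[1.0, 9.0], [2.0, 8.0], [9.0, 1.0]])
centroids = [np.array([1.5, 8.5]), np.array([9.0, 1.0])]
print(assign_clusters(X, centroids))  # -> [0 0 1]
```

Swapping the divergence (squared Euclidean for Gaussian, Itakura-Saito for exponential) changes the implied observation model while the clustering structure stays the same, which is the kind of generality the abstract describes.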
|Advisor:||Schapire, Robert E., Engelhardt, Barbara E.|
|Committee:||Blei, David M., Cuff, Paul, Ramadge, Peter J.|
|School Location:||United States -- New Jersey|
|Source:||DAI-B 78/06(E), Dissertation Abstracts International|
|Subjects:||Artificial intelligence, Computer science|
|Keywords:||Bregman divergence, Clustering, Exponential dispersion model, Machine learning, Matrix factorization, Missing data|
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved