This study brings together Bayesian networks, topic models, hierarchical Bayesian modeling, and nonparametric Bayesian methods into a framework for efficiently designing and implementing a family of (non)parametric Bayesian mixture models. Such models, including Bayesian topic models, have proven useful for modeling and discovering latent structure in many domains. We introduce networks of mixture blocks, a modeling framework that combines these developments to facilitate the definition and implementation of complex (non)parametric Bayesian networks for data with partitioned structure. A network of mixture blocks can be viewed as a Bayesian network factored into a network of sub-models, the mixture blocks, which are rendered conditionally independent of one another by auxiliary partition variables. We use this framework to develop several novel nonparametric Bayesian models for collaborative filtering and text modeling.
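The abstract's central device, an auxiliary partition variable that renders sub-models conditionally independent, is the same mechanism used in nonparametric mixtures such as the Dirichlet process mixture. As a hedged illustration (not the dissertation's actual implementation; the function name, priors, and hyperparameters below are assumptions for the sketch), here is a minimal collapsed Gibbs sampler over a Chinese-restaurant-process partition of binary feature vectors:

```python
import math
import random

def crp_gibbs(data, alpha=1.0, beta=0.5, iters=50, seed=0):
    """Illustrative collapsed Gibbs sampling for a CRP mixture of
    Bernoulli vectors; each cluster has per-dimension Bernoulli
    parameters under a Beta(beta, beta) prior (integrated out)."""
    rng = random.Random(seed)
    n, d = len(data), len(data[0])
    z = [0] * n                      # partition variable: cluster label per item
    counts = {0: n}                  # cluster sizes
    # per-cluster, per-dimension counts of ones
    sums = {0: [sum(x[j] for x in data) for j in range(d)]}

    def log_pred(x, c):
        # log predictive probability of x under cluster c's posterior
        nc, s = counts[c], sums[c]
        lp = 0.0
        for j in range(d):
            p1 = (s[j] + beta) / (nc + 2 * beta)
            lp += math.log(p1 if x[j] else 1.0 - p1)
        return lp

    for _ in range(iters):
        for i, x in enumerate(data):
            # remove item i from its current cluster
            c = z[i]
            counts[c] -= 1
            for j in range(d):
                sums[c][j] -= x[j]
            if counts[c] == 0:
                del counts[c], sums[c]
            # score existing clusters and a fresh one (CRP prior)
            cands = list(counts)
            logw = [math.log(counts[c2]) + log_pred(x, c2) for c2 in cands]
            new_c = max(counts, default=-1) + 1
            # new cluster: prior predictive, Beta(beta, beta) mean is 0.5
            cands.append(new_c)
            logw.append(math.log(alpha) + d * math.log(0.5))
            # sample a cluster from the normalized weights
            m = max(logw)
            w = [math.exp(v - m) for v in logw]
            r = rng.random() * sum(w)
            choice, acc = cands[-1], 0.0
            for c2, wt in zip(cands, w):
                acc += wt
                if r <= acc:
                    choice = c2
                    break
            # add item i to the chosen cluster
            z[i] = choice
            if choice not in counts:
                counts[choice] = 0
                sums[choice] = [0] * d
            counts[choice] += 1
            for j in range(d):
                sums[choice][j] += x[j]
    return z
```

In the dissertation's terms, the sampler above is a single "block": given the partition `z`, each cluster's likelihood factorizes independently, which is what allows the per-item conditional updates.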
|Committee:||Ihler, Alex, Stern, Hal|
|School:||University of California, Irvine|
|Department:||Information and Computer Science - Ph.D.|
|School Location:||United States -- California|
|Source:||DAI-B 71/06, Dissertation Abstracts International|
|Subjects:||Statistics, Artificial intelligence, Computer science|
|Keywords:||Bayesian networks, Machine learning, Mixture blocks, Nonparametric, Probabilistic, Topic models|
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved