This thesis develops a new divergence that generalizes relative entropy and can be used to compare probability measures without requiring absolute continuity. We establish properties of the divergence and, in particular, derive and exploit a representation as an infimal convolution of an optimal transport cost and relative entropy. We present examples of computing and approximating the divergence, together with applications to uncertainty quantification in discrete models and Gauss-Markov models.
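For orientation, the infimal-convolution representation referred to in the abstract presumably takes a form along the following lines; the symbols (cost function c, transport cost T_c, relative entropy R, and the ordering of arguments) are illustrative assumptions and are not taken from the record itself.

% Hedged sketch of an infimal-convolution representation of the divergence.
% All notation here is assumed for illustration, not quoted from the thesis.
\[
  D_c(\mu \,\|\, \nu)
  \;=\;
  \inf_{\gamma \in \mathcal{P}(S)}
  \Big\{ \, T_c(\mu, \gamma) \;+\; R(\gamma \,\|\, \nu) \, \Big\},
  \qquad
  T_c(\mu, \gamma)
  \;=\;
  \inf_{\pi \in \Pi(\mu, \gamma)} \int_{S \times S} c(x, y)\, \pi(dx, dy),
\]

where \(\Pi(\mu,\gamma)\) is the set of couplings of \(\mu\) and \(\gamma\), and \(R(\gamma \,\|\, \nu)\) is the relative entropy (KL divergence), finite only when \(\gamma \ll \nu\). Under this reading, the intermediate measure \(\gamma\) absorbs the absolute-continuity requirement while the transport cost bridges \(\mu\) and \(\gamma\), which is consistent with the abstract's claim that the divergence compares measures that need not be absolutely continuous with respect to one another.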
Advisor: Dupuis, Paul
Committee: Yau, Shing-Tung; Yau, Horng-Tzer
School: Harvard University
Department: Mathematics
School Location: United States -- Massachusetts
Source: DAI-A 82/5(E), Dissertation Abstracts International
Source Type: Dissertation
Subjects: Mathematics, Information science
Keywords: Convex duality, KL divergence, Model uncertainty, Optimal transport theory, Relative entropy
Publication Number: 28092743
ISBN: 9798698533917