
Dissertation/Thesis Abstract

High-Order Automatic Differentiation of Unmodified Linear Algebra Routines via Nilpotent Matrices
by Dunham, Benjamin Z., Ph.D., University of Colorado at Boulder, 2017, 198; 10270502
Abstract (Summary)

This work presents a new automatic differentiation method, Nilpotent Matrix Differentiation (NMD), capable of propagating any order of mixed or univariate derivative through common linear algebra functions—most notably third-party sparse solvers and decomposition routines, in addition to basic matrix arithmetic operations and power series—without changing data-type or modifying code line by line; this allows differentiation across sequences of arbitrarily many such functions with minimal implementation effort. NMD works by enlarging the matrices and vectors passed to the routines, replacing each original scalar with a matrix block augmented by derivative data; these blocks are constructed with special sparsity structures, termed “stencils,” each designed to be isomorphic to a particular multidimensional hypercomplex algebra. The algebras are in turn designed such that Taylor expansions of hypercomplex function evaluations are finite in length and thus exactly track derivatives without approximation error.
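The block-augmentation idea can be sketched for the simplest case, first derivatives, where the "stencil" is the 2×2 dual-number block: each scalar a with derivative b becomes [[a, b], [0, a]], and the off-diagonal block E satisfies E·E = 0 (nilpotency), so Taylor expansions truncate exactly. The example system, the NumPy solver, and the helper names below are illustrative assumptions, not the dissertation's implementation:

```python
import numpy as np

# Nilpotent first-derivative stencil: E @ E = 0, so blocks a*I + b*E form an
# algebra isomorphic to the dual numbers a + b*eps with eps**2 = 0.
I2 = np.eye(2)
E = np.array([[0.0, 1.0],
              [0.0, 0.0]])

def augment(M, dM):
    """Replace each scalar m_ij with the 2x2 block [[m, dm], [0, m]]."""
    return np.kron(M, I2) + np.kron(dM, E)

# Hypothetical parameterized system A(x) y = c with A(x) = [[x, 1], [1, 2]], at x = 3.
x = 3.0
A  = np.array([[x, 1.0], [1.0, 2.0]])
dA = np.array([[1.0, 0.0], [0.0, 0.0]])   # dA/dx
c  = np.array([[1.0], [0.0]])
dc = np.zeros_like(c)                      # c does not depend on x

# A single call to an *unmodified* third-party solver on the enlarged system
# propagates both the solution and its derivative.
Y = np.linalg.solve(augment(A, dA), augment(c, dc))

y    = Y[0::2, 0]   # values:      top-left entry of each 2x2 block
dydx = Y[0::2, 1]   # derivatives: top-right entry of each 2x2 block
# y == [0.4, -0.2], dydx == [-0.16, 0.08], matching dy/dx = -A^{-1} (dA/dx) y
```

Higher orders or mixed derivatives would use larger stencils encoding higher-dimensional hypercomplex algebras; the 2×2 case shown here only tracks a single first derivative.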

Although this use of the method in the "forward mode" is unique in its own right, it can also be applied to existing implementations of the (first-order) discrete adjoint method to find high-order derivatives with lowered cost complexity; for example, for a problem with N inputs and an adjoint solver whose cost is independent of N (i.e., O(1)), the N × N Hessian can be found in O(N) time, comparable to existing second-order adjoint methods that require far more problem-specific implementation effort. Higher derivatives are likewise less expensive; for example, an N × N × N rank-three tensor can be found in O(N²). Alternatively, a Hessian-vector product can be found in O(1) time, which may open up many matrix-based simulations to a range of existing optimization or surrogate-modeling approaches. As a final corollary, paralleling the NMD-adjoint hybrid method, the existing complex-step differentiation (CD) technique is also shown to be capable of finding the Hessian-vector product. All variants are implemented on a stochastic diffusion problem and compared in depth with various cost and accuracy metrics.
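The complex-step technique mentioned above can be sketched in a few lines: perturbing the input along the imaginary axis yields the derivative from a single function evaluation with no subtractive cancellation, so the step size can be taken far below machine epsilon. The test function here is an illustrative assumption, not an example from the dissertation:

```python
import numpy as np

def complex_step_derivative(f, x, h=1e-20):
    """First derivative via one complex evaluation: f'(x) ~ Im(f(x + i*h)) / h.
    No difference of nearby values is taken, so there is no cancellation error."""
    return np.imag(f(x + 1j * h)) / h

# Illustrative check: f(x) = x**3, so f'(2) = 12, recovered to machine precision.
deriv = complex_step_derivative(lambda x: x**3, 2.0)
```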

Indexing (document details)
Advisors: Maute, Kurt K., Starkey, Ryan P.
Committee: Argrow, Brian M., Doostan, Alireza, Fornberg, Bengt, Maute, Kurt K., Starkey, Ryan P.
School: University of Colorado at Boulder
Department: Aerospace Engineering
School Location: United States -- Colorado
Source: DAI-B 78/10(E), Dissertation Abstracts International
Subjects: Applied Mathematics, Mathematics, Aerospace engineering
Keywords: Adjoint, Automatic differentiation, Complex step, Hessian-vector, Linear algebra, Nilpotent element
Publication Number: 10270502
ISBN: 978-1-369-78533-3
Copyright © 2021 ProQuest LLC. All rights reserved.