Dissertation/Thesis Abstract

Delaunay-based Global Optimization Algorithms and their Applications
by Beyhaghi, Pooriya, Ph.D., University of California, San Diego, 2016, 266; 10190624
Abstract (Summary)

A new class of derivative-free optimization algorithms is developed to solve, with remarkable efficiency, a range of practical nonconvex optimization problems whose function evaluations are computationally (or experimentally) expensive. These new algorithms, which are provably convergent under appropriate assumptions, are response surface methods: at each iteration they minimize a search metric combining an interpolant of the existing datapoints with a synthetic model of the uncertainty of that interpolant, the latter built on the framework of a Delaunay triangulation over those datapoints. Unlike other response surface methods, our algorithms can employ any well-behaved interpolation strategy.
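As an illustrative sketch only (not the dissertation's implementation), a search metric of this flavor can take the form s(x) = p(x) − K·e(x), where p interpolates the data and e is a synthetic uncertainty that vanishes at the datapoints and grows inside each Delaunay simplex; the toy objective, the dataset, and the constant K below are assumptions made for illustration.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

def uncertainty(x, tri, pts):
    """Synthetic uncertainty e(x) = r^2 - |x - z|^2, where z and r are the
    circumcenter and circumradius of the Delaunay simplex containing x.
    e vanishes at the datapoints and is positive inside each simplex."""
    s = int(tri.find_simplex(np.asarray(x, float)))
    if s < 0:                      # outside the convex hull of the data
        return 0.0
    v = pts[tri.simplices[s]]
    # circumcenter z solves 2 (v_i - v_0) . z = |v_i|^2 - |v_0|^2
    A = 2.0 * (v[1:] - v[0])
    b = np.sum(v[1:] ** 2, axis=1) - np.sum(v[0] ** 2)
    z = np.linalg.solve(A, b)
    r2 = np.sum((v[0] - z) ** 2)
    return r2 - np.sum((np.asarray(x) - z) ** 2)

# toy 2D dataset and objective (hypothetical, for illustration only)
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.4, 0.6]], float)
obj = lambda X: np.sum((np.asarray(X) - 0.3) ** 2, axis=-1)
tri = Delaunay(pts)
p = LinearNDInterpolator(tri, obj(pts))   # interpolant of existing data

K = 2.0                            # uncertainty weight (tuning parameter)
g = np.linspace(0, 1, 41)
cand = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
s_vals = np.array([float(p(x)) - K * uncertainty(x, tri, pts) for x in cand])
x_next = cand[np.nanargmin(s_vals)]   # next point to evaluate
print(x_next)
```

Minimizing s(x) trades off exploitation (small interpolated value p) against exploration (large Delaunay-based uncertainty e), which is the balance the abstract describes.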

There are six main algorithms developed in this class thus far (four of which are presented in this thesis), which address a wide range of practical optimization problems. The first algorithm, dubbed Δ-DOGS, efficiently minimizes expensive objective functions inside linearly-constrained feasible domains. The second algorithm, Δ-DOGS(C), extends Δ-DOGS to efficiently handle more general convex search domains. The third algorithm incorporates a grid into Δ-DOGS to achieve better convergence by performing fewer function evaluations at the boundary of feasibility. The fourth algorithm, dubbed α-DOGS, efficiently minimizes objective functions which are noisy, generally derived by taking statistics of stationary and ergodic random processes. An important subproblem in α-DOGS is the estimation of the average of a stationary ergodic process; a new approach for determining this quantity is also presented which is mathematically rigorous and numerically accurate.
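The averaging subproblem can be illustrated with a standard batch-means computation (a hedged sketch of the general idea, not the thesis's actual estimator): estimate the infinite-time average of a stationary ergodic process from a finite sample, along with a standard error quantifying the uncertainty of that estimate. The AR(1) process and batch count below are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy stationary, ergodic process: AR(1) with true mean 0 (hypothetical)
n, phi = 20000, 0.8
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

# finite-time average as an estimate of the infinite-time mean
mean_est = x.mean()

# batch means: the samples are correlated, so split them into m batches
# whose averages are nearly independent, then use the spread of those
# batch averages to estimate the standard error of the overall mean
m = 100
batch_avgs = x[: n - n % m].reshape(m, -1).mean(axis=1)
std_err = batch_avgs.std(ddof=1) / np.sqrt(m)
print(mean_est, std_err)
```

The key difficulty the abstract alludes to is that naive standard-error formulas assume independent samples; for a correlated process the batch length must exceed the correlation time for the error estimate to be trustworthy.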

Rigorous convergence analyses of all of the proposed algorithms are presented, and necessary conditions to guarantee convergence to the global minimum are provided. For validation, the proposed algorithms have been tested both on well-known benchmark optimization problems and on new application-oriented optimization problems in ship design; illustrative results are presented.

Indexing (document details)
Advisor: Bewley, Thomas R.
Committee: De Oliveira, Mauricio, Del Alamo, Juan Carlos, Gill, Philip E., Meyer, David
School: University of California, San Diego
Department: Engineering Sciences (Mechanical Engineering)
School Location: United States -- California
Source: DAI-B 78/04(E), Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Mechanical engineering
Keywords: Delaunay triangulation, Derivative-free optimization, Global optimization, Stochastic optimization, Uncertainty quantification
Publication Number: 10190624
ISBN: 9781369349108