A new class of derivative-free optimization algorithms is developed to solve, with remarkable efficiency, a range of practical nonconvex optimization problems whose function evaluations are computationally (or experimentally) expensive. These new algorithms, which are provably convergent under the appropriate assumptions, are response surface methods which iteratively minimize metrics based on an interpolation of existing datapoints and a synthetic model of the uncertainty of this interpolant, which itself is built on the framework of a Delaunay triangulation over the existing datapoints. Unlike other response surface methods, our algorithms can employ any well-behaved interpolation strategy.
Six main algorithms have been developed in this class thus far (four of which are presented in this thesis), addressing a wide range of practical optimization problems. The first algorithm, dubbed Δ-DOGS, efficiently minimizes expensive objective functions inside linearly-constrained feasible domains. The second algorithm, Δ-DOGS(C), extends Δ-DOGS to handle more general convex search domains efficiently. The third algorithm incorporates a grid into Δ-DOGS to achieve better convergence by performing fewer function evaluations at the boundary of feasibility. The fourth algorithm, dubbed α-DOGS, efficiently minimizes objective functions that are noisy, generally derived by taking the statistics of stationary and ergodic random processes. An important subproblem in α-DOGS is the estimation of the average of a stationary ergodic process; a new approach for determining this quantity, which is mathematically rigorous and numerically accurate, is also presented.
Rigorous convergence analyses of all of the algorithms proposed are presented, and necessary conditions to guarantee convergence to the global minimum are provided. For validation, the proposed algorithms have been tested both on well-known benchmark optimization problems and on new application-oriented optimization problems in ship design; some illustrative results are shown.
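To make the core idea concrete, the sketch below illustrates one search step of a Delaunay-based response surface method of the kind described above. The synthetic uncertainty model is the piecewise-quadratic function that vanishes at the datapoints and, inside each Delaunay simplex, equals the squared circumradius minus the squared distance to the circumcenter. The specific interpolant (an RBF here), the search metric of the form p(x) − K·e(x), the value of K, and the coarse grid used to minimize the metric are all illustrative assumptions of this sketch, not the precise formulation used in the thesis.

```python
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import RBFInterpolator

# existing datapoints and their (expensive) function values -- toy data
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.4, 0.6]])
vals = np.array([1.0, 0.8, 0.9, 1.2, 0.3])

tri = Delaunay(pts)               # triangulation over existing datapoints
p = RBFInterpolator(pts, vals)    # any well-behaved interpolant may be used

def circumsphere(v):
    """Circumcenter z and squared circumradius r2 of a d-simplex (d+1 vertices)."""
    A = 2.0 * (v[1:] - v[0])
    b = np.sum(v[1:] ** 2 - v[0] ** 2, axis=1)
    z = np.linalg.solve(A, b)
    return z, float(np.sum((v[0] - z) ** 2))

def e(x):
    """Synthetic uncertainty: zero at each datapoint, positive inside each
    Delaunay simplex (squared circumradius minus squared distance to the
    circumcenter of the simplex containing x)."""
    s = int(tri.find_simplex(x))
    if s < 0:
        return 0.0  # outside the convex hull of the datapoints
    z, r2 = circumsphere(tri.points[tri.simplices[s]])
    return float(r2 - np.sum((np.asarray(x, dtype=float) - z) ** 2))

K = 2.0  # exploration weight (a tunable parameter in this sketch)

def search_metric(x):
    """Metric balancing the interpolant against the uncertainty model."""
    return float(p(np.atleast_2d(x))[0]) - K * e(x)

# pick the next evaluation point by minimizing the metric over a coarse grid
g = np.linspace(0.0, 1.0, 41)
cands = np.array([[a, b] for a in g for b in g])
x_next = cands[np.argmin([search_metric(c) for c in cands])]
```

After evaluating the objective at `x_next`, the point is appended to the dataset, the triangulation and interpolant are rebuilt, and the step repeats; the uncertainty collapses to zero at every sampled point, which drives the search toward unexplored regions as well as promising ones.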
|Advisor:||Bewley, Thomas R.|
|Committee:||De Oliveira, Mauricio, Del Alamo, Juan Carlos, Gill, Philip E., Meyer, David|
|School:||University of California, San Diego|
|Department:||Engineering Sciences (Mechanical Engineering)|
|School Location:||United States -- California|
|Source:||DAI-B 78/04(E), Dissertation Abstracts International|
|Keywords:||Delaunay triangulation, Derivative-free optimization, Global optimization, Stochastic optimization, Uncertainty quantification|