Alongside derivative-based methods, which scale better to higher-dimensional problems, derivative-free methods play an essential role in the optimization of many practical engineering systems, especially those in which function evaluations are determined by statistical averaging, and those for which the function of interest is nonconvex in the adjustable parameters.
This work focuses on the development of a new family of surrogate-based derivative-free optimization schemes, namely the $\Delta$-DOGS schemes. The idea unifying this efficient and (under the appropriate assumptions) provably-globally-convergent family of schemes is the minimization of a search function which linearly combines a computationally inexpensive ``surrogate'' (that is, an interpolation, or in some cases a regression, of recent function evaluations; we generally favor some variant of polyharmonic splines for this purpose), which summarizes the trends evident in the data available thus far, with a synthetic piecewise-quadratic ``uncertainty function'' (built on the framework of a Delaunay triangulation of the existing datapoints), which characterizes the reliability of the surrogate by quantifying the distance of any given point in parameter space from the nearest function evaluations.
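Schematically (the notation here is illustrative, and is not fixed by this summary), at each iteration $k$ the next datapoint is obtained by minimizing a search function of the form
\[
s_k(x) \,=\, p_k(x) \,-\, K\, e_k(x),
\]
where $p_k(x)$ is the inexpensive surrogate interpolating the function evaluations available thus far, $e_k(x)$ is the piecewise-quadratic uncertainty function, which vanishes at each existing datapoint and is positive elsewhere within the Delaunay triangulation, and $K>0$ is a tunable parameter balancing global exploration against local refinement.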
This thesis introduces a handful of new schemes in the $\Delta$-DOGS family:
(a) $\Delta$-DOGS($\Omega$) is designed for nonconvex (even disconnected) feasible domains defined by computable constraint functions within a bounded search domain.
(b) $\Delta$-DOGS($\Omega_Z$) accelerates the convergence of $\Delta$-DOGS($\Omega$) by restricting the function evaluation coordinates at each iteration to lie on a Cartesian grid over the bounded search domain; this grid is successively refined as convergence is approached.
(c) gradient-based acceleration of $\Delta$-DOGS combines derivative-free global exploration with derivative-based local refinement.
(d) $\Delta$-DOGS($\Lambda$) instead restricts the function evaluations at each iteration to lie on a dense lattice (derived from an $n$-dimensional sphere packing) rather than a Cartesian grid, and extends the approach to linearly constrained search domains.
(e) $\alpha$-DOGSX is designed to simultaneously increase the sampling time, and to refine the numerical approximation, as convergence is approached.
This work also introduces a method to rescale the parameter domain under consideration adaptively, based on the data observed during the optimization process, thereby obtaining a significantly smoother surrogate; this method is called the Multivariate Adaptive Polyharmonic Splines (MAPS) surrogate model.
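For context (this is the standard cubic polyharmonic spline form, with the adaptive rescaling indicated only schematically), an interpolant of the evaluations $\{x_i, f(x_i)\}$ may be written as
\[
p(x) \,=\, \sum_{i} w_i\, \varphi\!\left(\|x - x_i\|\right) \,+\, v^T x \,+\, c,
\qquad \varphi(r) = r^3,
\]
and MAPS may be viewed, roughly, as applying such a fit in adaptively rescaled coordinates $\tilde{x} = A\,x$, where the scaling $A$ is informed by the variation of the observed data; the precise construction is developed in the body of this work.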
Practical applications of these algorithms are explored, including
(A) the design of low-storage implicit/explicit Runge-Kutta (IMEXRK) schemes for high performance computing (HPC) problems such as the numerical simulation of turbulent flows, and (B) the design of airfoils and hydrofoils.
These algorithms are compared with existing state-of-the-art algorithms, in particular the Surrogate Management Framework (SMF) using a Kriging model and Mesh Adaptive Direct Search (MADS), on both standard synthetic test problems and computer-aided shape design problems; in most cases, the new $\Delta$-DOGS algorithms outperform the existing approaches.