The field of n-dimensional sphere packings is elegant and mature in its mathematical development and characterization. However, practical application of this powerful body of work is lacking. The line of research presented in this work explores the application of sphere packings to the field of derivative-free optimization. Chapter 2 reviews the essential results available in this field, then extends these results by: (a) assembling a catalog of key properties of the principal dense and rare sphere packings and nets available, including hundreds of values not previously known; (b) introducing and characterizing several new families of regular rare sphere packings and nets; and (c) developing a new algorithm for the efficient solution of discrete Thomson problems restricted to nearest-neighbor points. These results are leveraged heavily in the applications addressed in Chapters 3 and 4.

In particular, Chapter 3 builds on this presentation to develop a new algorithm for Lattice-Based Derivative-free Optimization via Global Surrogates (LABDOGS), which leverages dense sphere packings as an alternative to Cartesian grids to coordinate derivative-free searches. LABDOGS is a highly efficient, globally convergent optimization algorithm that requires nothing more than a definition of the feasible domain and a handle to the cost function, and it demonstrates superior performance and convergence rates compared with its Cartesian-based competitors.

Chapter 4 builds on the material of Chapters 2 and 3 to develop a highly efficient, locally convergent derivative-free optimization algorithm called L-MADS, which builds upon and improves the Mesh Adaptive Direct Search (MADS) class of optimization algorithms. L-MADS offers an alternative to the Successive Polling (SP) substep of LABDOGS, providing a locally convergent pattern search that, unlike SP, retains good convergence behavior when challenging constraints on the feasible region are encountered. L-MADS inherits all the convergence characteristics of the best available MADS algorithms while significantly improving convergence rates.
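To make concrete the idea of coordinating a search with a dense sphere packing rather than a Cartesian grid, the following is a minimal sketch (not the LABDOGS implementation itself) of snapping a trial point onto the D_n checkerboard lattice using the standard closest-point rule of Conway and Sloane; the choice of the D_n lattice, the function name, and the use of Python/NumPy are illustrative assumptions only.

```python
import numpy as np

def closest_point_Dn(x):
    """Snap a point x in R^n to the nearest site of the D_n lattice
    (integer vectors whose coordinates sum to an even number),
    following the closest-point rule of Conway & Sloane."""
    x = np.asarray(x, dtype=float)
    f = np.round(x)                      # nearest integer vector
    if int(f.sum()) % 2 == 0:            # already a D_n lattice point
        return f
    # Otherwise flip the rounding of the coordinate with the largest
    # rounding error, which restores an even coordinate sum.
    err = x - f
    k = np.argmax(np.abs(err))
    f[k] += np.sign(err[k]) if err[k] != 0 else 1.0
    return f

# Example: quantize a trial point onto the lattice coordinating the search.
print(closest_point_Dn([0.6, 1.4, -2.1]))   # -> [ 1.  1. -2.]
```

In a grid-based search the analogous step is a plain coordinate-wise rounding; replacing that quantizer with a lattice quantizer such as the one sketched above is what allows the search to inherit the more uniform neighbor structure of a dense packing.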