eScholarship
Open Access Publications from the University of California

UC San Diego Electronic Theses and Dissertations

Informative Hyper-parameter Optimization and Selection

Abstract

Hyper-parameter optimization methods allow efficient and robust hyper-parameter searching without the need to hand-select each value and combination. Although hyper-parameter tuners such as BOHB, Hyperopt, and SMAC have been investigated by researchers in terms of performance, there has yet to be an in-depth analysis of the values each tuner selects over all iterations. We propose a thorough aggregation of data on the efficiency of the search values selected by each tuner over 59 datasets and ten popular ML algorithms from Scikit-learn. From this extensive accumulated data, we observe and advise which tuners show better results for particular datasets, through their meta-data, and algorithms. Through this research, we have also developed a simple plug-in for BOHB, Hyperopt, and SMAC into DARPA's Data-Driven Discovery of Models (D3M) Auto-ML systems for smooth integration of various tuners. This is advantageous because the desired hyper-parameter tuner may change depending on the pipeline search method in an Auto-ML system, particularly when compared with Auto-ML systems that utilize only one search method. Our results show that for Auto-ML systems, the Hyperopt tuner gives more desirable results in fewer iterations due to its significant exploration component, and BOHB performs best overall across a large number of datasets and algorithms owing to its strategic budgeting.
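To make concrete the kind of search loop that tuners like BOHB, Hyperopt, and SMAC automate, here is a minimal random-search sketch. This is an illustrative assumption, not the thesis' method or any of the named tuners' algorithms; the function names, the search space, and the toy objective are all hypothetical.

```python
import random

def random_search(objective, space, n_iter=200, seed=0):
    """Minimal tuner sketch: sample configurations uniformly at random
    from the search space and keep the one with the lowest objective.
    (Real tuners such as BOHB or Hyperopt sample adaptively instead.)"""
    rng = random.Random(seed)
    best_config, best_score = None, float("inf")
    for _ in range(n_iter):
        # Draw one value per hyper-parameter from its (low, high) range.
        config = {name: rng.uniform(lo, hi) for name, (lo, hi) in space.items()}
        score = objective(config)
        if score < best_score:
            best_config, best_score = config, score
    return best_config, best_score

# Hypothetical toy objective with a known minimum at lr=0.1, reg=1.0;
# in practice this would be a model's validation loss.
def toy_loss(cfg):
    return (cfg["lr"] - 0.1) ** 2 + (cfg["reg"] - 1.0) ** 2

space = {"lr": (0.0, 1.0), "reg": (0.0, 2.0)}
best, loss = random_search(toy_loss, space)
```

In this framing, each tuner in the comparison differs only in *how* the next configuration is chosen: Hyperopt's TPE emphasizes exploration, while BOHB additionally allocates evaluation budgets across configurations.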
