Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research
Proceedings of the First International Conference on Automated Machine Learning, PMLR 188:16/1-23, 2022.
Abstract
We present Syne Tune, a library for large-scale distributed hyperparameter optimization (HPO). Syne Tune’s modular architecture allows users to easily switch between different execution backends to facilitate experimentation, and simplifies contributing new optimization algorithms. To foster reproducible benchmarking, Syne Tune provides an efficient simulator backend and a benchmarking suite, which are essential for large-scale evaluations of distributed asynchronous HPO algorithms on tabulated and surrogate benchmarks. We showcase these functionalities with a range of state-of-the-art gradient-free optimizers, including multi-fidelity and transfer learning approaches, on popular benchmarks from the literature. Additionally, we demonstrate the benefits of Syne Tune for constrained and multi-objective HPO applications through two use cases: the former considers hyperparameters that induce fair solutions, and the latter automatically selects machine types along with the conventional hyperparameters.
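To give a flavor of the modular design the abstract describes, the following is a minimal usage sketch in the style of Syne Tune’s documentation. The training script name (train_script.py), the metric and resource names (val_loss, epoch), and the hyperparameter names are placeholders, and exact argument names may differ between library versions; switching to another execution backend (e.g., the simulator backend) amounts to replacing the trial_backend argument.

```python
from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.baselines import ASHA

# Search space over two hyperparameters; "epochs" is a constant passed
# through to the training script and used as the maximum resource level.
config_space = {
    "lr": loguniform(1e-4, 1e-1),
    "batch_size": randint(16, 128),
    "epochs": 10,
}

tuner = Tuner(
    # train_script.py is assumed to report "epoch" and "val_loss" after
    # each epoch via syne_tune.Reporter; swapping in a different backend
    # only changes this argument, the scheduler and stopping logic are reused.
    trial_backend=LocalBackend(entry_point="train_script.py"),
    # ASHA: asynchronous multi-fidelity scheduler that stops or promotes
    # trials based on intermediate results at each resource level.
    scheduler=ASHA(
        config_space,
        metric="val_loss",
        mode="min",
        resource_attr="epoch",
        max_resource_attr="epochs",
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=600),
    n_workers=4,  # number of trials evaluated in parallel
)
tuner.run()
```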