Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research

David Salinas, Matthias Seeger, Aaron Klein, Valerio Perrone, Martin Wistuba, Cedric Archambeau
Proceedings of the First International Conference on Automated Machine Learning, PMLR 188:16/1-23, 2022.

Abstract

We present Syne Tune, a library for large-scale distributed hyperparameter optimization (HPO). Syne Tune’s modular architecture allows users to easily switch between different execution backends to facilitate experimentation and makes it easy to contribute new optimization algorithms. To foster reproducible benchmarking, Syne Tune provides an efficient simulator backend and a benchmarking suite, which are essential for large-scale evaluations of distributed asynchronous HPO algorithms on tabulated and surrogate benchmarks. We showcase these functionalities with a range of state-of-the-art gradient-free optimizers, including multi-fidelity and transfer learning approaches on popular benchmarks from the literature. Additionally, we demonstrate the benefits of Syne Tune for constrained and multi-objective HPO applications through two use cases: the former considers hyperparameters that induce fair solutions and the latter automatically selects machine types along with the conventional hyperparameters.
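As a brief illustration of the interface described in the abstract, a minimal tuning loop might look as follows. This is a sketch based on the public Syne Tune API around the time of this paper; the training script name and the hyperparameter names are placeholders, and exact signatures may differ between library versions.

# Minimal Syne Tune sketch: tune a training script locally with ASHA.
# "train_script.py" and the hyperparameter names below are placeholders;
# the script is expected to report the metric back to the tuner
# (e.g. via syne_tune.Reporter) once per epoch.
from syne_tune import Tuner, StoppingCriterion
from syne_tune.backend import LocalBackend
from syne_tune.config_space import loguniform, randint
from syne_tune.optimizer.baselines import ASHA

# Search space over the hyperparameters exposed by the training script.
config_space = {
    "lr": loguniform(1e-4, 1e-1),
    "batch_size": randint(16, 128),
    "epochs": 50,
}

tuner = Tuner(
    # LocalBackend runs trials as subprocesses on the local machine;
    # other backends (e.g. the simulator used for benchmarking) can be
    # swapped in here without changing the rest of the setup.
    trial_backend=LocalBackend(entry_point="train_script.py"),
    scheduler=ASHA(
        config_space,
        metric="validation_error",
        resource_attr="epoch",
        max_resource_attr="epochs",
        mode="min",
    ),
    stop_criterion=StoppingCriterion(max_wallclock_time=3600),
    n_workers=4,
)
tuner.run()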

Cite this Paper


BibTeX
@InProceedings{pmlr-v188-salinas22a,
  title     = {Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research},
  author    = {Salinas, David and Seeger, Matthias and Klein, Aaron and Perrone, Valerio and Wistuba, Martin and Archambeau, Cedric},
  booktitle = {Proceedings of the First International Conference on Automated Machine Learning},
  pages     = {16/1--23},
  year      = {2022},
  editor    = {Guyon, Isabelle and Lindauer, Marius and van der Schaar, Mihaela and Hutter, Frank and Garnett, Roman},
  volume    = {188},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v188/salinas22a/salinas22a.pdf},
  url       = {https://proceedings.mlr.press/v188/salinas22a.html},
  abstract  = {We present Syne Tune, a library for large-scale distributed hyperparameter optimization (HPO). Syne Tune’s modular architecture allows users to easily switch between different execution backends to facilitate experimentation and makes it easy to contribute new optimization algorithms. To foster reproducible benchmarking, Syne Tune provides an efficient simulator backend and a benchmarking suite, which are essential for large-scale evaluations of distributed asynchronous HPO algorithms on tabulated and surrogate benchmarks. We showcase these functionalities with a range of state-of-the-art gradient-free optimizers, including multi-fidelity and transfer learning approaches on popular benchmarks from the literature. Additionally, we demonstrate the benefits of Syne Tune for constrained and multi-objective HPO applications through two use cases: the former considers hyperparameters that induce fair solutions and the latter automatically selects machine types along with the conventional hyperparameters.}
}
Endnote
%0 Conference Paper
%T Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research
%A David Salinas
%A Matthias Seeger
%A Aaron Klein
%A Valerio Perrone
%A Martin Wistuba
%A Cedric Archambeau
%B Proceedings of the First International Conference on Automated Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Isabelle Guyon
%E Marius Lindauer
%E Mihaela van der Schaar
%E Frank Hutter
%E Roman Garnett
%F pmlr-v188-salinas22a
%I PMLR
%P 16/1--23
%U https://proceedings.mlr.press/v188/salinas22a.html
%V 188
%X We present Syne Tune, a library for large-scale distributed hyperparameter optimization (HPO). Syne Tune’s modular architecture allows users to easily switch between different execution backends to facilitate experimentation and makes it easy to contribute new optimization algorithms. To foster reproducible benchmarking, Syne Tune provides an efficient simulator backend and a benchmarking suite, which are essential for large-scale evaluations of distributed asynchronous HPO algorithms on tabulated and surrogate benchmarks. We showcase these functionalities with a range of state-of-the-art gradient-free optimizers, including multi-fidelity and transfer learning approaches on popular benchmarks from the literature. Additionally, we demonstrate the benefits of Syne Tune for constrained and multi-objective HPO applications through two use cases: the former considers hyperparameters that induce fair solutions and the latter automatically selects machine types along with the conventional hyperparameters.
APA
Salinas, D., Seeger, M., Klein, A., Perrone, V., Wistuba, M. & Archambeau, C. (2022). Syne Tune: A Library for Large Scale Hyperparameter Tuning and Reproducible Research. Proceedings of the First International Conference on Automated Machine Learning, in Proceedings of Machine Learning Research 188:16/1-23. Available from https://proceedings.mlr.press/v188/salinas22a.html.
