Transfer Learning with Gaussian Processes for Bayesian Optimization

Petru Tighineanu, Kathrin Skubch, Paul Baireuther, Attila Reiss, Felix Berkenkamp, Julia Vinogradska
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:6152-6181, 2022.

Abstract

Bayesian optimization is a powerful paradigm to optimize black-box functions based on scarce and noisy data. Its data efficiency can be further improved by transfer learning from related tasks. While recent transfer models meta-learn a prior based on large amounts of data, in the low-data regime methods that exploit the closed-form posterior of Gaussian processes (GPs) have an advantage. In this setting, several analytically tractable transfer-model posteriors have been proposed, but the relative advantages of these methods are not well understood. In this paper, we provide a unified view on hierarchical GP models for transfer learning, which allows us to analyze the relationship between methods. As part of the analysis, we develop a novel closed-form boosted GP transfer model that fits between existing approaches in terms of complexity. We evaluate the performance of the different approaches in large-scale experiments and highlight strengths and weaknesses of the different transfer-learning methods.
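The "closed-form posterior" the abstract refers to is the standard Gaussian-process regression posterior, mean $K_*^\top (K + \sigma^2 I)^{-1} y$ and covariance $K_{**} - K_*^\top (K + \sigma^2 I)^{-1} K_*$. As a minimal sketch of that textbook formula (not code from the paper; the squared-exponential kernel and all hyperparameter values below are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: k(a, b) = s^2 * exp(-(a - b)^2 / (2 l^2))
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=1e-2):
    """Closed-form GP posterior mean and marginal variance at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)      # cross-covariance, train vs. test
    Kss = rbf_kernel(x_test, x_test)      # prior covariance at test points
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

# Illustrative usage: condition on three noisy observations of sin(x).
x = np.array([0.0, 1.0, 2.0])
y = np.sin(x)
mu, var = gp_posterior(x, y, np.array([0.5, 1.5]))
```

Because this posterior is analytic, the hierarchical transfer models the paper studies can propagate it exactly rather than resorting to meta-learned approximations, which is the advantage the abstract claims for the low-data regime.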

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-tighineanu22a,
  title     = {Transfer Learning with Gaussian Processes for Bayesian Optimization},
  author    = {Tighineanu, Petru and Skubch, Kathrin and Baireuther, Paul and Reiss, Attila and Berkenkamp, Felix and Vinogradska, Julia},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {6152--6181},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/tighineanu22a/tighineanu22a.pdf},
  url       = {https://proceedings.mlr.press/v151/tighineanu22a.html},
  abstract  = {Bayesian optimization is a powerful paradigm to optimize black-box functions based on scarce and noisy data. Its data efficiency can be further improved by transfer learning from related tasks. While recent transfer models meta-learn a prior based on large amounts of data, in the low-data regime methods that exploit the closed-form posterior of Gaussian processes (GPs) have an advantage. In this setting, several analytically tractable transfer-model posteriors have been proposed, but the relative advantages of these methods are not well understood. In this paper, we provide a unified view on hierarchical GP models for transfer learning, which allows us to analyze the relationship between methods. As part of the analysis, we develop a novel closed-form boosted GP transfer model that fits between existing approaches in terms of complexity. We evaluate the performance of the different approaches in large-scale experiments and highlight strengths and weaknesses of the different transfer-learning methods.}
}
Endnote
%0 Conference Paper
%T Transfer Learning with Gaussian Processes for Bayesian Optimization
%A Petru Tighineanu
%A Kathrin Skubch
%A Paul Baireuther
%A Attila Reiss
%A Felix Berkenkamp
%A Julia Vinogradska
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-tighineanu22a
%I PMLR
%P 6152--6181
%U https://proceedings.mlr.press/v151/tighineanu22a.html
%V 151
%X Bayesian optimization is a powerful paradigm to optimize black-box functions based on scarce and noisy data. Its data efficiency can be further improved by transfer learning from related tasks. While recent transfer models meta-learn a prior based on large amounts of data, in the low-data regime methods that exploit the closed-form posterior of Gaussian processes (GPs) have an advantage. In this setting, several analytically tractable transfer-model posteriors have been proposed, but the relative advantages of these methods are not well understood. In this paper, we provide a unified view on hierarchical GP models for transfer learning, which allows us to analyze the relationship between methods. As part of the analysis, we develop a novel closed-form boosted GP transfer model that fits between existing approaches in terms of complexity. We evaluate the performance of the different approaches in large-scale experiments and highlight strengths and weaknesses of the different transfer-learning methods.
APA
Tighineanu, P., Skubch, K., Baireuther, P., Reiss, A., Berkenkamp, F. & Vinogradska, J. (2022). Transfer Learning with Gaussian Processes for Bayesian Optimization. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:6152-6181. Available from https://proceedings.mlr.press/v151/tighineanu22a.html.

Related Material