Bayesian optimization for automated model selection

Gustavo Malkomes, Chip Schaff, Roman Garnett
Proceedings of the Workshop on Automatic Machine Learning, PMLR 64:41-47, 2016.

Abstract

Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.

Cite this Paper


BibTeX
@InProceedings{pmlr-v64-malkomes_bayesian_2016,
  title     = {Bayesian optimization for automated model selection},
  author    = {Gustavo Malkomes and Chip Schaff and Roman Garnett},
  pages     = {41--47},
  year      = {2016},
  editor    = {Frank Hutter and Lars Kotthoff and Joaquin Vanschoren},
  volume    = {64},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v64/malkomes_bayesian_2016.pdf},
  url       = {http://proceedings.mlr.press/v64/malkomes_bayesian_2016.html},
  abstract  = {Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.}
}
Endnote
%0 Conference Paper
%T Bayesian optimization for automated model selection
%A Gustavo Malkomes
%A Chip Schaff
%A Roman Garnett
%B Proceedings of the Workshop on Automatic Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Frank Hutter
%E Lars Kotthoff
%E Joaquin Vanschoren
%F pmlr-v64-malkomes_bayesian_2016
%I PMLR
%J Proceedings of Machine Learning Research
%P 41--47
%U http://proceedings.mlr.press
%V 64
%W PMLR
%X Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.
RIS
TY  - CPAPER
TI  - Bayesian optimization for automated model selection
AU  - Gustavo Malkomes
AU  - Chip Schaff
AU  - Roman Garnett
BT  - Proceedings of the Workshop on Automatic Machine Learning
PY  - 2016/12/04
DA  - 2016/12/04
ED  - Frank Hutter
ED  - Lars Kotthoff
ED  - Joaquin Vanschoren
ID  - pmlr-v64-malkomes_bayesian_2016
PB  - PMLR
SP  - 41
DP  - PMLR
EP  - 47
L1  - http://proceedings.mlr.press/v64/malkomes_bayesian_2016.pdf
UR  - http://proceedings.mlr.press/v64/malkomes_bayesian_2016.html
AB  - Despite the success of kernel-based nonparametric methods, kernel selection still requires considerable expertise, and is often described as a “black art.” We present a sophisticated method for automatically searching for an appropriate kernel from an infinite space of potential choices. Previous efforts in this direction have focused on traversing a kernel grammar, only examining the data via computation of marginal likelihood. Our proposed search method is based on Bayesian optimization in model space, where we reason about model evidence as a function to be maximized. We explicitly reason about the data distribution and how it induces similarity between potential model choices in terms of the explanations they can offer for observed data. In this light, we construct a novel kernel between models to explain a given dataset. Our method is capable of finding a model that explains a given dataset well without any human assistance, often with fewer computations of model evidence than previous approaches, a claim we demonstrate empirically.
ER  -
APA
Malkomes, G., Schaff, C., & Garnett, R. (2016). Bayesian optimization for automated model selection. Proceedings of the Workshop on Automatic Machine Learning, in PMLR 64:41-47.