Pareto Frontier Learning with Expensive Correlated Objectives

Amar Shah, Zoubin Ghahramani
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1919-1927, 2016.

Abstract

There has been a surge of research interest in developing tools and analysis for Bayesian optimization, the task of finding the global maximizer of an unknown, expensive function through sequential evaluation using Bayesian decision theory. However, many interesting problems involve optimizing multiple expensive-to-evaluate objectives simultaneously, and relatively little research has addressed this setting from a Bayesian decision-theoretic standpoint. A prevailing choice when tackling this problem is to model the multiple objectives as independent, typically for ease of computation. In practice, however, objectives are correlated to some extent. In this work, we incorporate the modelling of inter-task correlations, developing an approximation to overcome intractable integrals. We illustrate the power of modelling dependencies between objectives on a range of synthetic and real-world multi-objective optimization problems.
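
The abstract contrasts the common independent-objectives assumption with a joint model of correlated objectives. As a rough, hypothetical sketch of what such a joint model can look like (not the paper's exact construction), the snippet below uses an intrinsic coregionalization model: a shared input kernel k scaled by a task-covariance matrix B, so that B = I recovers independent objectives while off-diagonal entries let observations of one objective inform predictions of another. All names (rbf, icm_posterior_mean, B_corr) and the toy data are illustrative assumptions.

```python
import numpy as np

# Sketch only: two objectives on a shared input space, modelled jointly with an
# intrinsic coregionalization model (ICM). The joint covariance between
# (x, task i) and (x', task j) is  K((x, i), (x', j)) = B[i, j] * k(x, x'),
# where B is a positive semi-definite task-covariance matrix.

def rbf(X, Y, lengthscale=0.5):
    """Squared-exponential kernel on the inputs."""
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def icm_posterior_mean(X, y, tasks, X_star, task_star, B, noise=1e-3):
    """Posterior mean at (X_star, task_star) given noisy observations of all tasks.

    X: (n, d) observed inputs; y: (n,) observed values; tasks: (n,) task ids.
    """
    K = B[np.ix_(tasks, tasks)] * rbf(X, X)                # joint train covariance
    K_star = B[np.ix_(task_star, tasks)] * rbf(X_star, X)  # test/train cross covariance
    alpha = np.linalg.solve(K + noise * np.eye(len(y)), y)
    return K_star @ alpha

# Toy usage: with a correlated B, observations of objective 0 shift the
# posterior for objective 1; with B = I they have no effect.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(8, 1))
tasks = np.array([0, 0, 0, 0, 1, 1, 1, 1])
y = np.where(tasks == 0, np.sin(3 * X[:, 0]), 0.9 * np.sin(3 * X[:, 0]) + 0.1)
B_corr = np.array([[1.0, 0.9], [0.9, 1.0]])   # correlated objectives
B_ind = np.eye(2)                             # independent objectives
X_star = np.linspace(0, 1, 5)[:, None]
mu_corr = icm_posterior_mean(X, y, tasks, X_star, np.ones(5, dtype=int), B_corr)
mu_ind = icm_posterior_mean(X, y, tasks, X_star, np.ones(5, dtype=int), B_ind)
```

In this sketch the independent model ignores half the data when predicting either objective, which is the gap a correlated model aims to close; the paper's contribution is handling such dependencies within the Bayesian optimization acquisition itself, via an approximation to otherwise intractable integrals.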

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-shahc16,
  title     = {Pareto Frontier Learning with Expensive Correlated Objectives},
  author    = {Shah, Amar and Ghahramani, Zoubin},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1919--1927},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/shahc16.pdf},
  url       = {https://proceedings.mlr.press/v48/shahc16.html},
  abstract  = {There has been a surge of research interest in developing tools and analysis for Bayesian optimization, the task of finding the global maximizer of an unknown, expensive function through sequential evaluation using Bayesian decision theory. However, many interesting problems involve optimizing multiple, expensive to evaluate objectives simultaneously, and relatively little research has addressed this setting from a Bayesian theoretic standpoint. A prevailing choice when tackling this problem, is to model the multiple objectives as being independent, typically for ease of computation. In practice, objectives are correlated to some extent. In this work, we incorporate the modelling of inter-task correlations, developing an approximation to overcome intractable integrals. We illustrate the power of modelling dependencies between objectives on a range of synthetic and real world multi-objective optimization problems.}
}
Endnote
%0 Conference Paper
%T Pareto Frontier Learning with Expensive Correlated Objectives
%A Amar Shah
%A Zoubin Ghahramani
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-shahc16
%I PMLR
%P 1919--1927
%U https://proceedings.mlr.press/v48/shahc16.html
%V 48
%X There has been a surge of research interest in developing tools and analysis for Bayesian optimization, the task of finding the global maximizer of an unknown, expensive function through sequential evaluation using Bayesian decision theory. However, many interesting problems involve optimizing multiple, expensive to evaluate objectives simultaneously, and relatively little research has addressed this setting from a Bayesian theoretic standpoint. A prevailing choice when tackling this problem, is to model the multiple objectives as being independent, typically for ease of computation. In practice, objectives are correlated to some extent. In this work, we incorporate the modelling of inter-task correlations, developing an approximation to overcome intractable integrals. We illustrate the power of modelling dependencies between objectives on a range of synthetic and real world multi-objective optimization problems.
RIS
TY - CPAPER
TI - Pareto Frontier Learning with Expensive Correlated Objectives
AU - Amar Shah
AU - Zoubin Ghahramani
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-shahc16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 1919
EP - 1927
L1 - http://proceedings.mlr.press/v48/shahc16.pdf
UR - https://proceedings.mlr.press/v48/shahc16.html
AB - There has been a surge of research interest in developing tools and analysis for Bayesian optimization, the task of finding the global maximizer of an unknown, expensive function through sequential evaluation using Bayesian decision theory. However, many interesting problems involve optimizing multiple, expensive to evaluate objectives simultaneously, and relatively little research has addressed this setting from a Bayesian theoretic standpoint. A prevailing choice when tackling this problem, is to model the multiple objectives as being independent, typically for ease of computation. In practice, objectives are correlated to some extent. In this work, we incorporate the modelling of inter-task correlations, developing an approximation to overcome intractable integrals. We illustrate the power of modelling dependencies between objectives on a range of synthetic and real world multi-objective optimization problems.
ER -
APA
Shah, A. & Ghahramani, Z. (2016). Pareto Frontier Learning with Expensive Correlated Objectives. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1919-1927. Available from https://proceedings.mlr.press/v48/shahc16.html.
