Learning to Optimize Combinatorial Functions

Nir Rosenfeld, Eric Balkanski, Amir Globerson, Yaron Singer
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4374-4383, 2018.

Abstract

Submodular functions have become a ubiquitous tool in machine learning. They are learnable from data, and can be optimized efficiently and with guarantees. Nonetheless, recent negative results show that optimizing learned surrogates of submodular functions can result in arbitrarily bad approximations of the true optimum. Our goal in this paper is to highlight the source of this hardness, and propose an alternative criterion for optimizing general combinatorial functions from sampled data. We prove a tight equivalence showing that a class of functions is optimizable if and only if it can be learned. We provide efficient and scalable optimization algorithms for several function classes of interest, and demonstrate their utility on the task of optimally choosing trending social media items.
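The abstract's statement that submodular functions "can be optimized efficiently and with guarantees" refers to classical value-oracle results such as the greedy algorithm's (1 - 1/e) approximation for maximizing a monotone submodular function under a cardinality constraint (Nemhauser, Wolsey & Fisher, 1978). The sketch below is a minimal illustration of that classical setting, not code from the paper; the function names and the small coverage example are hypothetical.

# Illustrative sketch (not from the paper): greedy maximization of a monotone
# submodular set function f under a cardinality constraint k. For such f the
# greedy algorithm achieves at least a (1 - 1/e) fraction of the optimum.

def greedy_max(f, ground_set, k):
    """Pick k elements greedily by marginal gain of the set function f."""
    chosen = set()
    for _ in range(k):
        best, best_gain = None, float("-inf")
        for x in ground_set - chosen:
            gain = f(chosen | {x}) - f(chosen)
            if gain > best_gain:
                best, best_gain = x, gain
        chosen.add(best)
    return chosen

# Example: a coverage function, a canonical monotone submodular function.
coverage_sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d", "e"}}
f_cover = lambda S: len(set().union(*(coverage_sets[i] for i in S))) if S else 0
print(greedy_max(f_cover, set(coverage_sets), k=2))  # e.g. {1, 3}

This illustrates optimization with full oracle access to f. The paper's setting differs: f is only observed through sampled data, where optimizing a learned surrogate can fail, which motivates the alternative optimization-from-samples criterion studied in the paper.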

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-rosenfeld18a,
  title =     {Learning to Optimize Combinatorial Functions},
  author =    {Rosenfeld, Nir and Balkanski, Eric and Globerson, Amir and Singer, Yaron},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages =     {4374--4383},
  year =      {2018},
  editor =    {Dy, Jennifer and Krause, Andreas},
  volume =    {80},
  series =    {Proceedings of Machine Learning Research},
  month =     {10--15 Jul},
  publisher = {PMLR},
  pdf =       {http://proceedings.mlr.press/v80/rosenfeld18a/rosenfeld18a.pdf},
  url =       {https://proceedings.mlr.press/v80/rosenfeld18a.html},
  abstract =  {Submodular functions have become a ubiquitous tool in machine learning. They are learnable from data, and can be optimized efficiently and with guarantees. Nonetheless, recent negative results show that optimizing learned surrogates of submodular functions can result in arbitrarily bad approximations of the true optimum. Our goal in this paper is to highlight the source of this hardness, and propose an alternative criterion for optimizing general combinatorial functions from sampled data. We prove a tight equivalence showing that a class of functions is optimizable if and only if it can be learned. We provide efficient and scalable optimization algorithms for several function classes of interest, and demonstrate their utility on the task of optimally choosing trending social media items.}
}
Endnote
%0 Conference Paper
%T Learning to Optimize Combinatorial Functions
%A Nir Rosenfeld
%A Eric Balkanski
%A Amir Globerson
%A Yaron Singer
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-rosenfeld18a
%I PMLR
%P 4374--4383
%U https://proceedings.mlr.press/v80/rosenfeld18a.html
%V 80
%X Submodular functions have become a ubiquitous tool in machine learning. They are learnable from data, and can be optimized efficiently and with guarantees. Nonetheless, recent negative results show that optimizing learned surrogates of submodular functions can result in arbitrarily bad approximations of the true optimum. Our goal in this paper is to highlight the source of this hardness, and propose an alternative criterion for optimizing general combinatorial functions from sampled data. We prove a tight equivalence showing that a class of functions is optimizable if and only if it can be learned. We provide efficient and scalable optimization algorithms for several function classes of interest, and demonstrate their utility on the task of optimally choosing trending social media items.
APA
Rosenfeld, N., Balkanski, E., Globerson, A. & Singer, Y. (2018). Learning to Optimize Combinatorial Functions. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4374-4383. Available from https://proceedings.mlr.press/v80/rosenfeld18a.html.
