Submodular Optimization under Noise

Avinatan Hassidim, Yaron Singer
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:1069-1122, 2017.

Abstract

We consider the problem of maximizing a monotone submodular function under noise. Since the 1970s there has been a great deal of work on optimization of submodular functions under various constraints, resulting in algorithms that provide desirable approximation guarantees. In many applications, however, we do not have access to the submodular function we aim to optimize, but rather to some erroneous or noisy version of it. This raises the question of whether provable guarantees are obtainable in the presence of error and noise. We provide initial answers by focusing on the problem of maximizing a monotone submodular function under a cardinality constraint when given access to a noisy oracle of the function. We show that there is an algorithm whose approximation ratio is arbitrarily close to the optimal $1-1/e$ when the cardinality is sufficiently large. The algorithm can be applied in a variety of related problems including maximizing approximately submodular functions, and optimization with correlated noise. When the noise is adversarial we show that no non-trivial approximation guarantee can be obtained.
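The setting described above can be illustrated with a minimal sketch: the classic greedy algorithm for monotone submodular maximization under a cardinality constraint, run against a noisy value oracle. This is NOT the paper's algorithm; it assumes the noise is i.i.d. across oracle calls, so averaging repeated queries concentrates the estimate, whereas the paper handles the harder setting where each set's noisy value is fixed. The function names and parameters here are illustrative only.

```python
def noisy_greedy(elements, noisy_f, k, samples=50):
    """Greedy maximization of a monotone submodular function under a
    cardinality constraint k, given only a noisy oracle noisy_f.

    Sketch only: assumes noise is independent across oracle calls, so
    averaging `samples` repeated queries gives a concentrated estimate
    of f(S). The paper's setting (persistent, correlated noise) requires
    a substantially different algorithm.
    """
    S = set()
    for _ in range(k):
        best, best_est = None, float("-inf")
        for x in elements - S:
            # Estimate the value of S + x by averaging repeated noisy queries.
            est = sum(noisy_f(S | {x}) for _ in range(samples)) / samples
            if est > best_est:
                best, best_est = x, est
        S.add(best)
    return S
```

For example, with a modular (hence submodular) objective perturbed by multiplicative noise, the averaged greedy recovers the top-k elements when the noise is small relative to the gaps between marginal values.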

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-hassidim17a,
  title     = {Submodular Optimization under Noise},
  author    = {Hassidim, Avinatan and Singer, Yaron},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {1069--1122},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/hassidim17a/hassidim17a.pdf},
  url       = {https://proceedings.mlr.press/v65/hassidim17a.html},
  abstract  = {We consider the problem of maximizing a monotone submodular function under noise. Since the 1970s there has been a great deal of work on optimization of submodular functions under various constraints, resulting in algorithms that provide desirable approximation guarantees. In many applications, however, we do not have access to the submodular function we aim to optimize, but rather to some erroneous or noisy version of it. This raises the question of whether provable guarantees are obtainable in the presence of error and noise. We provide initial answers by focusing on the problem of maximizing a monotone submodular function under a cardinality constraint when given access to a noisy oracle of the function. We show that there is an algorithm whose approximation ratio is arbitrarily close to the optimal $1-1/e$ when the cardinality is sufficiently large. The algorithm can be applied in a variety of related problems including maximizing approximately submodular functions, and optimization with correlated noise. When the noise is adversarial we show that no non-trivial approximation guarantee can be obtained.}
}
Endnote
%0 Conference Paper
%T Submodular Optimization under Noise
%A Avinatan Hassidim
%A Yaron Singer
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-hassidim17a
%I PMLR
%P 1069--1122
%U https://proceedings.mlr.press/v65/hassidim17a.html
%V 65
%X We consider the problem of maximizing a monotone submodular function under noise. Since the 1970s there has been a great deal of work on optimization of submodular functions under various constraints, resulting in algorithms that provide desirable approximation guarantees. In many applications, however, we do not have access to the submodular function we aim to optimize, but rather to some erroneous or noisy version of it. This raises the question of whether provable guarantees are obtainable in the presence of error and noise. We provide initial answers by focusing on the problem of maximizing a monotone submodular function under a cardinality constraint when given access to a noisy oracle of the function. We show that there is an algorithm whose approximation ratio is arbitrarily close to the optimal $1-1/e$ when the cardinality is sufficiently large. The algorithm can be applied in a variety of related problems including maximizing approximately submodular functions, and optimization with correlated noise. When the noise is adversarial we show that no non-trivial approximation guarantee can be obtained.
APA
Hassidim, A. & Singer, Y. (2017). Submodular Optimization under Noise. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:1069-1122. Available from https://proceedings.mlr.press/v65/hassidim17a.html.