Submodular Optimization under Noise
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:1069-1122, 2017.
We consider the problem of maximizing a monotone submodular function under noise. Since the 1970s there has been a great deal of work on optimizing submodular functions under various constraints, resulting in algorithms that provide desirable approximation guarantees. In many applications, however, we do not have access to the submodular function we aim to optimize, but rather to some erroneous or noisy version of it. This raises the question of whether provable guarantees are obtainable in the presence of error and noise. We provide initial answers by focusing on the problem of maximizing a monotone submodular function under a cardinality constraint when given access to a noisy oracle of the function. We show that there is an algorithm whose approximation ratio is arbitrarily close to the optimal $1-1/e$ when the cardinality is sufficiently large. The algorithm can be applied to a variety of related problems, including maximizing approximately submodular functions and optimizing under correlated noise. When the noise is adversarial, we show that no non-trivial approximation guarantee can be obtained.
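To make the setting concrete, the following sketch shows the classic greedy algorithm for monotone submodular maximization under a cardinality constraint (which achieves the optimal $1-1/e$ ratio given exact oracle access), queried here through a noisy value oracle. This is an illustrative example under assumed i.i.d. multiplicative noise, not the paper's noise-robust algorithm; the coverage function, `noisy_oracle`, and noise parameter `eps` are hypothetical choices for the demonstration.

```python
import random

def greedy_max(oracle, ground_set, k):
    """Greedy maximization: repeatedly add the element with the largest
    (noisy) marginal gain until k elements are chosen.

    Illustrative sketch only -- with a noisy oracle, the classic 1 - 1/e
    guarantee no longer holds in general.
    """
    S = set()
    for _ in range(k):
        best, best_gain = None, float("-inf")
        for e in ground_set - S:
            gain = oracle(S | {e}) - oracle(S)  # noisy marginal value
            if gain > best_gain:
                best, best_gain = e, gain
        S.add(best)
    return S

# Example: a coverage function f(S) = |union of covered items|,
# a canonical monotone submodular function.
universe_sets = {
    "a": {1, 2, 3},
    "b": {3, 4},
    "c": {5, 6, 7, 8},
    "d": {1, 5},
}

def coverage(S):
    covered = set()
    for e in S:
        covered |= universe_sets[e]
    return len(covered)

def noisy_oracle(S, eps=0.05):
    # Multiplicative noise drawn i.i.d. from [1 - eps, 1 + eps]
    # on each query (an assumed noise model for illustration).
    return coverage(S) * random.uniform(1 - eps, 1 + eps)

random.seed(0)
solution = greedy_max(noisy_oracle, set(universe_sets), k=2)
print(solution, coverage(solution))
```

With small noise the greedy choices here are robust (element `"c"` strictly dominates every first pick even after 5% perturbation), but as the abstract notes, adversarial noise can defeat any such algorithm.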