Non-monotone Submodular Maximization with Nearly Optimal Adaptivity and Query Complexity

Matthew Fahrbach, Vahab Mirrokni, Morteza Zadimoghaddam
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1833-1842, 2019.

Abstract

Submodular maximization is a general optimization problem with a wide range of applications in machine learning (e.g., active learning, clustering, and feature selection). In large-scale optimization, the parallel running time of an algorithm is governed by its adaptivity, which measures the number of sequential rounds needed if the algorithm can execute polynomially many independent oracle queries in parallel. While low adaptivity is ideal, it is not sufficient for an algorithm to be efficient in practice: there are many applications of distributed submodular optimization where the number of function evaluations becomes prohibitively expensive. Motivated by these applications, we study the adaptivity and query complexity of submodular maximization. In this paper, we give the first constant-factor approximation algorithm for maximizing a non-monotone submodular function subject to a cardinality constraint $k$ that runs in $O(\log(n))$ adaptive rounds and makes $O(n \log(k))$ oracle queries in expectation. In our empirical study, we use three real-world applications to compare our algorithm with several benchmarks for non-monotone submodular maximization. The results demonstrate that our algorithm finds competitive solutions using significantly fewer rounds and queries.
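
The notions of adaptivity and oracle queries in the abstract can be made concrete with a small sketch. The Python toy below is an illustrative assumption, not the paper's algorithm or experiments: the coverage instance UNIVERSE and the helpers f, filter_round, and threshold_sample are all made up for illustration. The point to notice is that every marginal-gain query inside filter_round depends only on the current solution, so one call to it is one adaptive round of independent, parallelizable queries, and the geometrically shrinking threshold visits only logarithmically many threshold values.

import random

# Toy coverage oracle: each ground-set element covers a set of "topics";
# f(S) counts the distinct topics covered. Coverage is submodular. This
# instance is an illustrative assumption, not a benchmark from the paper.
UNIVERSE = {
    0: {1, 2, 3},
    1: {3, 4},
    2: {4, 5, 6},
    3: {1, 6, 7},
    4: {7, 8},
}

def f(S):
    """Oracle value: number of distinct topics covered by S."""
    covered = set()
    for e in S:
        covered |= UNIVERSE[e]
    return len(covered)

def filter_round(S, candidates, tau):
    """One adaptive round: every marginal-gain query below depends only
    on the current S, so all of them could be issued in parallel."""
    base = f(S)
    return {e for e in candidates if f(S | {e}) - base >= tau}

def threshold_sample(k, eps=0.5):
    """Sketch of a low-adaptivity threshold loop: sweep a geometrically
    decreasing threshold tau, batching in surviving candidates."""
    S, C = set(), set(UNIVERSE)
    tau = max(f({e}) for e in C)  # one round of singleton queries
    while tau >= 1 and len(S) < k:
        good = filter_round(S, C - S, tau)
        if good:
            # Add a uniformly random batch; randomness is what hedges
            # the expected value when f is non-monotone.
            batch = random.sample(sorted(good), min(len(good), k - len(S)))
            S |= set(batch)
        else:
            tau *= 1 - eps  # nothing clears tau: lower the threshold
    return S

if __name__ == "__main__":
    S = threshold_sample(k=3)
    print(S, f(S))

Randomizing which surviving candidates enter the solution matters in the non-monotone setting: a deterministic greedy pick can lock in elements whose later marginal gains turn negative, whereas random batching keeps the expected value high.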

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-fahrbach19a,
  title     = {Non-monotone Submodular Maximization with Nearly Optimal Adaptivity and Query Complexity},
  author    = {Fahrbach, Matthew and Mirrokni, Vahab and Zadimoghaddam, Morteza},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {1833--1842},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/fahrbach19a/fahrbach19a.pdf},
  url       = {https://proceedings.mlr.press/v97/fahrbach19a.html}
}
Endnote
%0 Conference Paper
%T Non-monotone Submodular Maximization with Nearly Optimal Adaptivity and Query Complexity
%A Matthew Fahrbach
%A Vahab Mirrokni
%A Morteza Zadimoghaddam
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-fahrbach19a
%I PMLR
%P 1833--1842
%U https://proceedings.mlr.press/v97/fahrbach19a.html
%V 97
APA
Fahrbach, M., Mirrokni, V. & Zadimoghaddam, M. (2019). Non-monotone Submodular Maximization with Nearly Optimal Adaptivity and Query Complexity. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1833-1842. Available from https://proceedings.mlr.press/v97/fahrbach19a.html.