Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation

Yixin Chen, Wenjing Chen, Alan Kuhnle
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:7840-7879, 2025.

Abstract

With the rapid growth of data in modern applications, parallel combinatorial algorithms for maximizing non-monotone submodular functions have gained significant attention. In the parallel computation setting, the state-of-the-art approximation ratio of $1/e$ is achieved by a continuous algorithm (Ene & Nguyen, 2020) with adaptivity $\mathcal{O}(\log(n))$. In this work, we focus on size constraints and present the first combinatorial algorithm matching this bound: a randomized parallel approach achieving a $(1/e - \epsilon)$ approximation ratio. This result bridges the gap between continuous and combinatorial approaches for this problem. As a byproduct, we also develop a simpler $(1/4 - \epsilon)$-approximation algorithm that succeeds with high probability $(\ge 1 - 1/n)$. Both algorithms achieve $\mathcal{O}(\log(n) \log(k))$ adaptivity and $\mathcal{O}(n \log(n) \log(k))$ query complexity. Empirical results show our algorithms achieve competitive objective values, with the $(1/4 - \epsilon)$-approximation algorithm being particularly query-efficient.

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-chen25i,
  title     = {Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation},
  author    = {Chen, Yixin and Chen, Wenjing and Kuhnle, Alan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {7840--7879},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/chen25i/chen25i.pdf},
  url       = {https://proceedings.mlr.press/v267/chen25i.html},
  abstract  = {With the rapid growth of data in modern applications, parallel combinatorial algorithms for maximizing non-monotone submodular functions have gained significant attention. In the parallel computation setting, the state-of-the-art approximation ratio of $1/e$ is achieved by a continuous algorithm (Ene \& Nguyen, 2020) with adaptivity $\mathcal{O}(\log(n))$. In this work, we focus on size constraints and present the first combinatorial algorithm matching this bound: a randomized parallel approach achieving a $(1/e - \epsilon)$ approximation ratio. This result bridges the gap between continuous and combinatorial approaches for this problem. As a byproduct, we also develop a simpler $(1/4 - \epsilon)$-approximation algorithm that succeeds with high probability $(\ge 1 - 1/n)$. Both algorithms achieve $\mathcal{O}(\log(n) \log(k))$ adaptivity and $\mathcal{O}(n \log(n) \log(k))$ query complexity. Empirical results show our algorithms achieve competitive objective values, with the $(1/4 - \epsilon)$-approximation algorithm being particularly query-efficient.}
}
Endnote
%0 Conference Paper
%T Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation
%A Yixin Chen
%A Wenjing Chen
%A Alan Kuhnle
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-chen25i
%I PMLR
%P 7840--7879
%U https://proceedings.mlr.press/v267/chen25i.html
%V 267
%X With the rapid growth of data in modern applications, parallel combinatorial algorithms for maximizing non-monotone submodular functions have gained significant attention. In the parallel computation setting, the state-of-the-art approximation ratio of $1/e$ is achieved by a continuous algorithm (Ene & Nguyen, 2020) with adaptivity $\mathcal{O}(\log(n))$. In this work, we focus on size constraints and present the first combinatorial algorithm matching this bound: a randomized parallel approach achieving a $(1/e - \epsilon)$ approximation ratio. This result bridges the gap between continuous and combinatorial approaches for this problem. As a byproduct, we also develop a simpler $(1/4 - \epsilon)$-approximation algorithm that succeeds with high probability $(\ge 1 - 1/n)$. Both algorithms achieve $\mathcal{O}(\log(n) \log(k))$ adaptivity and $\mathcal{O}(n \log(n) \log(k))$ query complexity. Empirical results show our algorithms achieve competitive objective values, with the $(1/4 - \epsilon)$-approximation algorithm being particularly query-efficient.
APA
Chen, Y., Chen, W. & Kuhnle, A. (2025). Breaking Barriers: Combinatorial Algorithms for Non-Monotone Submodular Maximization with Sublinear Adaptivity and $1/e$ Approximation. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:7840-7879. Available from https://proceedings.mlr.press/v267/chen25i.html.

Related Material