Optimal approximation for unconstrained non-submodular minimization

Marwa El Halabi, Stefanie Jegelka
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:3961-3972, 2020.

Abstract

Submodular function minimization is well studied, and existing algorithms solve it exactly or up to arbitrary accuracy. However, in many applications, such as structured sparse learning or batch Bayesian optimization, the objective function is not exactly submodular, but close. In this case, no theoretical guarantees exist. Indeed, submodular minimization algorithms rely on intricate connections between submodularity and convexity. We show how these relations can be extended to obtain approximation guarantees for minimizing non-submodular functions, characterized by how close the function is to submodular. We also extend this result to noisy function evaluations. Our approximation results are the first for minimizing non-submodular functions, and are optimal, as established by our matching lower bound.
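Background (standard definitions, not taken from this page): the "connections between submodularity and convexity" the abstract refers to are, in the exactly submodular case, usually expressed through the Lovász extension. A brief sketch of the standard relation, as a point of reference for the approximate setting the paper studies:

A set function $F : 2^V \to \mathbb{R}$ is submodular if
$$F(A) + F(B) \ge F(A \cup B) + F(A \cap B) \qquad \text{for all } A, B \subseteq V,$$
and its Lovász extension $f : [0,1]^{|V|} \to \mathbb{R}$ is convex if and only if $F$ is submodular, with
$$\min_{A \subseteq V} F(A) = \min_{x \in [0,1]^{|V|}} f(x),$$
so unconstrained submodular minimization reduces to convex minimization over the unit cube. The paper extends such relations to functions that are only approximately submodular, with guarantees parameterized by how close the function is to submodular.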

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-halabi20a,
  title     = {Optimal approximation for unconstrained non-submodular minimization},
  author    = {Halabi, Marwa El and Jegelka, Stefanie},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {3961--3972},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/halabi20a/halabi20a.pdf},
  url       = {https://proceedings.mlr.press/v119/halabi20a.html},
  abstract  = {Submodular function minimization is well studied, and existing algorithms solve it exactly or up to arbitrary accuracy. However, in many applications, such as structured sparse learning or batch Bayesian optimization, the objective function is not exactly submodular, but close. In this case, no theoretical guarantees exist. Indeed, submodular minimization algorithms rely on intricate connections between submodularity and convexity. We show how these relations can be extended to obtain approximation guarantees for minimizing non-submodular functions, characterized by how close the function is to submodular. We also extend this result to noisy function evaluations. Our approximation results are the first for minimizing non-submodular functions, and are optimal, as established by our matching lower bound.}
}
Endnote
%0 Conference Paper
%T Optimal approximation for unconstrained non-submodular minimization
%A Marwa El Halabi
%A Stefanie Jegelka
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-halabi20a
%I PMLR
%P 3961--3972
%U https://proceedings.mlr.press/v119/halabi20a.html
%V 119
%X Submodular function minimization is well studied, and existing algorithms solve it exactly or up to arbitrary accuracy. However, in many applications, such as structured sparse learning or batch Bayesian optimization, the objective function is not exactly submodular, but close. In this case, no theoretical guarantees exist. Indeed, submodular minimization algorithms rely on intricate connections between submodularity and convexity. We show how these relations can be extended to obtain approximation guarantees for minimizing non-submodular functions, characterized by how close the function is to submodular. We also extend this result to noisy function evaluations. Our approximation results are the first for minimizing non-submodular functions, and are optimal, as established by our matching lower bound.
APA
Halabi, M.E. & Jegelka, S. (2020). Optimal approximation for unconstrained non-submodular minimization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:3961-3972. Available from https://proceedings.mlr.press/v119/halabi20a.html.