Multivariate Submodular Optimization

Richard Santiago, F. Bruce Shepherd
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5599-5609, 2019.

Abstract

Submodular functions have found a wealth of new applications in data science and machine learning models in recent years. This has been coupled with many algorithmic advances in the area of submodular optimization: (SO) $\min/\max f(S): S \in \mathcal{F}$, where $\mathcal{F}$ is a given family of feasible sets over a ground set $V$ and $f:2^V \rightarrow \mathbb{R}$ is submodular. In this work we focus on a more general class of multivariate submodular optimization (MVSO) problems: $\min/\max f(S_1,S_2,\ldots,S_k): S_1 \uplus S_2 \uplus \cdots \uplus S_k \in \mathcal{F}$. Here $\uplus$ denotes the union of disjoint sets, and hence this model is attractive in settings where resources are allocated across $k$ agents who share a “joint” multivariate nonnegative objective $f(S_1,S_2,\ldots,S_k)$ that captures some type of submodularity (i.e., diminishing-returns) property. We provide explicit examples and potential applications for this new framework. For maximization, we show that practical algorithms such as accelerated greedy variants and distributed algorithms achieve good approximation guarantees for very general families (such as matroids and $p$-systems). For arbitrary families, we show that monotone (resp. nonmonotone) MVSO admits an $\alpha(1-1/e)$ (resp. $\alpha \cdot 0.385$) approximation whenever monotone (resp. nonmonotone) SO admits an $\alpha$-approximation over the multilinear formulation. This substantially expands the family of tractable models. On the minimization side we give essentially optimal approximations in terms of the curvature of $f$.
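As an illustration of the MVSO model only (not the paper's exact algorithm), the following minimal Python sketch runs a plain multivariate greedy under an assumed cardinality constraint on $|S_1 \uplus \cdots \uplus S_k|$: at each step it assigns the (element, agent) pair with the largest marginal gain of the joint objective. The oracle signature `f(parts)`, the helper `greedy_mvso`, and the toy weighted objective are hypothetical placeholders.

```python
# Illustrative sketch (assumption: cardinality-constrained monotone MVSO).
# Greedily builds k disjoint parts S_1, ..., S_k, at each step adding the
# (element, agent) pair with the largest marginal gain of f(S_1, ..., S_k).

def greedy_mvso(ground_set, k, budget, f):
    """Assign at most `budget` elements of `ground_set` to k disjoint parts,
    where f takes a tuple of k sets and returns a nonnegative value."""
    parts = [set() for _ in range(k)]
    remaining = set(ground_set)
    for _ in range(budget):
        current_value = f(tuple(frozenset(p) for p in parts))
        best_gain, best_choice = 0.0, None
        for e in remaining:
            for i in range(k):
                trial = [set(p) for p in parts]
                trial[i].add(e)
                gain = f(tuple(frozenset(p) for p in trial)) - current_value
                if gain > best_gain:
                    best_gain, best_choice = gain, (e, i)
        if best_choice is None:  # no positive marginal gain remains
            break
        e, i = best_choice
        parts[i].add(e)
        remaining.remove(e)
    return parts


if __name__ == "__main__":
    # Toy (modular, hence submodular) objective: each agent values the
    # items assigned to it according to agent-specific weights.
    weights = [{"a": 3, "b": 1, "c": 2}, {"a": 1, "b": 4, "c": 1}]

    def f(parts):
        return sum(weights[i][e] for i, part in enumerate(parts) for e in part)

    print(greedy_mvso({"a", "b", "c"}, k=2, budget=2, f=f))
    # -> [{'a'}, {'b'}]: "a" goes to agent 0 and "b" to agent 1
```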

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-santiago19a,
  title     = {Multivariate Submodular Optimization},
  author    = {Santiago, Richard and Shepherd, F. Bruce},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5599--5609},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/santiago19a/santiago19a.pdf},
  url       = {https://proceedings.mlr.press/v97/santiago19a.html}
}
Endnote
%0 Conference Paper
%T Multivariate Submodular Optimization
%A Richard Santiago
%A F. Bruce Shepherd
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-santiago19a
%I PMLR
%P 5599--5609
%U https://proceedings.mlr.press/v97/santiago19a.html
%V 97
APA
Santiago, R. & Shepherd, F.B. (2019). Multivariate Submodular Optimization. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5599-5609. Available from https://proceedings.mlr.press/v97/santiago19a.html.