Submodular Maximization beyond Nonnegativity: Guarantees, Fast Algorithms, and Applications
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:2634-2643, 2019.
Abstract
It is generally believed that submodular functions -- and the more general class of $\gamma$-weakly submodular functions -- may only be optimized under the nonnegativity assumption $f(S) \geq 0$. In this paper, we show that once the function is expressed as the difference $f = g - c$, where $g$ is monotone, nonnegative, and $\gamma$-weakly submodular and $c$ is nonnegative modular, then strong approximation guarantees may be obtained. We present an algorithm for maximizing $g - c$ under a $k$-cardinality constraint which produces a random feasible set $S$ such that $\mathbb{E}[g(S) - c(S)] \geq (1 - e^{-\gamma} - \epsilon)\, g(\mathrm{OPT}) - c(\mathrm{OPT})$, whose running time is $O(\frac{n}{\epsilon} \log^2 \frac{1}{\epsilon})$, independent of $k$. We extend these results to the unconstrained setting by describing an algorithm with the same approximation guarantees and faster $O(\frac{n}{\epsilon} \log \frac{1}{\epsilon})$ runtime. The main techniques underlying our algorithms are twofold: the use of a surrogate objective which varies the relative importance between $g$ and $c$ throughout the algorithm, and a geometric sweep over possible $\gamma$ values. Our algorithmic guarantees are complemented by a hardness result showing that no polynomial-time algorithm which accesses $g$ through a value oracle can do better. We empirically demonstrate the success of our algorithms by applying them to experimental design on the Boston Housing dataset and directed vertex cover on the Email EU dataset.
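The surrogate-objective idea described in the abstract -- down-weighting $g$ relative to $c$ early in the run and gradually restoring its full weight -- can be sketched as a "distorted" greedy pass. The sketch below is an illustrative implementation under stated assumptions, not the paper's exact algorithm: the toy coverage function `g`, the modular cost `c`, the `cover`/`cost` dictionaries, and the distortion schedule $(1 - \gamma/k)^{k-i-1}$ are all assumptions made for the example.

```python
def distorted_greedy(g, c, ground, k, gamma=1.0):
    """Greedy pass on the distorted surrogate
        (1 - gamma/k)^(k - i - 1) * g(S) - c(S),
    which shifts the relative importance between g and c over the run:
    the cost c dominates early, and g regains full weight by the end.
    Assumes g is monotone, nonnegative, and gamma-weakly submodular,
    and c is nonnegative modular (a sketch, not the paper's algorithm).
    """
    S = set()
    for i in range(k):
        w = (1.0 - gamma / k) ** (k - i - 1)  # distortion factor at step i
        best, best_gain = None, 0.0
        for e in ground - S:
            # distorted marginal gain of adding e to the current set S
            gain = w * (g(S | {e}) - g(S)) - c({e})
            if gain > best_gain:
                best, best_gain = e, gain
        if best is not None:  # skip steps with no positive distorted gain
            S.add(best)
    return S

# Toy instance (hypothetical data): g is a coverage function, c a modular cost.
cover = {'a': {1, 2, 3}, 'b': {3, 4}, 'c': {5}}
cost = {'a': 0.5, 'b': 0.1, 'c': 10.0}
g = lambda S: len(set().union(*(cover[e] for e in S))) if S else 0
c = lambda S: sum(cost[e] for e in S)

S = distorted_greedy(g, c, {'a', 'b', 'c'}, k=2)
```

On this instance the expensive element `'c'` never has positive distorted gain, so the pass returns the cheap, high-coverage pair.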