Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains

Andrew An Bian, Baharan Mirzasoleiman, Joachim Buhmann, Andreas Krause
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:111-120, 2017.

Abstract

Submodular continuous functions are a class of (generally) non-convex/non-concave functions with a wide spectrum of applications. We characterize these functions and demonstrate that they can be maximized efficiently with approximation guarantees. Specifically, i) we introduce the weak DR property, which gives a unified characterization of submodularity for all set, integer-lattice and continuous functions; ii) for maximizing monotone DR-submodular continuous functions under general down-closed convex constraints, we propose a Frank-Wolfe variant with a (1-1/e) approximation guarantee and sub-linear convergence rate; iii) for maximizing general non-monotone submodular continuous functions subject to box constraints, we propose a DoubleGreedy algorithm with a 1/3 approximation guarantee. Submodular continuous functions naturally find applications in various real-world settings, including influence and revenue maximization with continuous assignments, sensor energy management, facility location, etc. Experimental results show that the proposed algorithms efficiently generate superior solutions compared to baseline algorithms.
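The Frank-Wolfe variant mentioned in the abstract can be sketched in a few lines. The sketch below is illustrative only, not the paper's reference implementation: it assumes a specific monotone DR-submodular example function F(x) = 1 - Π(1 - x_i) on [0,1]^n, and a simple down-closed budget polytope {v ∈ [0,1]^n : Σ v_i ≤ b} with integer budget b, so the inner linear maximization reduces to picking the coordinates with the largest gradient entries. The function names (`grad_F`, `frank_wolfe_variant`) are hypothetical.

```python
import numpy as np

def grad_F(x):
    # Gradient of the example objective F(x) = 1 - prod(1 - x_i),
    # which is monotone DR-submodular on [0, 1]^n.
    # dF/dx_i = prod_{j != i} (1 - x_j)
    return np.array([np.prod(np.delete(1.0 - x, i)) for i in range(len(x))])

def frank_wolfe_variant(grad, n, budget, iters=100):
    """Start at 0 and take `iters` steps of size 1/iters, each along a
    maximizer of the linearized objective over the constraint polytope."""
    x = np.zeros(n)
    for _ in range(iters):
        g = grad(x)
        # Linear maximization over {v in [0,1]^n : sum(v) <= budget}
        # with an integer budget: set the top-`budget` coordinates to 1.
        v = np.zeros(n)
        v[np.argsort(-g)[:int(budget)]] = 1.0
        x = x + v / iters
    return x
```

Because every step has length 1/iters and stays inside the down-closed polytope, the final iterate is feasible, and for monotone DR-submodular objectives the abstract's (1-1/e) guarantee applies (up to the sub-linear convergence error).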

Cite this Paper


BibTeX
@InProceedings{pmlr-v54-bian17a,
  title     = {{Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains}},
  author    = {Bian, Andrew An and Mirzasoleiman, Baharan and Buhmann, Joachim and Krause, Andreas},
  booktitle = {Proceedings of the 20th International Conference on Artificial Intelligence and Statistics},
  pages     = {111--120},
  year      = {2017},
  editor    = {Singh, Aarti and Zhu, Jerry},
  volume    = {54},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--22 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v54/bian17a/bian17a.pdf},
  url       = {https://proceedings.mlr.press/v54/bian17a.html},
  abstract  = {Submodular continuous functions are a category of (generally) non-convex/non-concave functions with a wide spectrum of applications. We characterize these functions and demonstrate that they can be maximized efficiently with approximation guarantees. Specifically, i) We introduce the weak DR property that gives a unified characterization of submodularity for all set, integer-lattice and continuous functions; ii) for maximizing monotone DR-submodular continuous functions under general down-closed convex constraints, we propose a Frank-Wolfe variant with (1-1/e) approximation guarantee, and sub-linear convergence rate; iii) for maximizing general non-monotone submodular continuous functions subject to box constraints, we propose a DoubleGreedy algorithm with 1/3 approximation guarantee. Submodular continuous functions naturally find applications in various real-world settings, including influence and revenue maximization with continuous assignments, sensor energy management, facility location, etc. Experimental results show that the proposed algorithms efficiently generate superior solutions compared to baseline algorithms.}
}
Endnote
%0 Conference Paper
%T Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains
%A Andrew An Bian
%A Baharan Mirzasoleiman
%A Joachim Buhmann
%A Andreas Krause
%B Proceedings of the 20th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2017
%E Aarti Singh
%E Jerry Zhu
%F pmlr-v54-bian17a
%I PMLR
%P 111--120
%U https://proceedings.mlr.press/v54/bian17a.html
%V 54
%X Submodular continuous functions are a category of (generally) non-convex/non-concave functions with a wide spectrum of applications. We characterize these functions and demonstrate that they can be maximized efficiently with approximation guarantees. Specifically, i) We introduce the weak DR property that gives a unified characterization of submodularity for all set, integer-lattice and continuous functions; ii) for maximizing monotone DR-submodular continuous functions under general down-closed convex constraints, we propose a Frank-Wolfe variant with (1-1/e) approximation guarantee, and sub-linear convergence rate; iii) for maximizing general non-monotone submodular continuous functions subject to box constraints, we propose a DoubleGreedy algorithm with 1/3 approximation guarantee. Submodular continuous functions naturally find applications in various real-world settings, including influence and revenue maximization with continuous assignments, sensor energy management, facility location, etc. Experimental results show that the proposed algorithms efficiently generate superior solutions compared to baseline algorithms.
APA
Bian, A.A., Mirzasoleiman, B., Buhmann, J. & Krause, A. (2017). Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains. Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 54:111-120. Available from https://proceedings.mlr.press/v54/bian17a.html.