Decentralized Gradient Tracking for Continuous DR-Submodular Maximization

Jiahao Xie, Chao Zhang, Zebang Shen, Chao Mi, Hui Qian
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2897-2906, 2019.

Abstract

In this paper, we focus on continuous DR-submodular maximization over a network. Using the gradient tracking technique, we propose two decentralized algorithms, one for the deterministic setting and one for the stochastic setting. The proposed methods attain the tight approximation ratio to $\epsilon$-accuracy for monotone continuous DR-submodular functions in only $O(1/\epsilon)$ and $\tilde{O}(1/\epsilon)$ rounds of communication, respectively, improving on the state of the art. Our numerical results show that the proposed methods outperform existing decentralized methods in both computation and communication complexity.
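To give a feel for the two ingredients the abstract combines, the following is a minimal, illustrative sketch of decentralized gradient tracking paired with a Frank-Wolfe-style (continuous greedy) ascent step. It is not the paper's exact algorithm: the toy quadratic objectives, the ring topology, the box constraint $[0,1]^d$, and all names and constants here are assumptions made for illustration. Each node mixes its iterate and its gradient tracker with its neighbors, and the tracker is corrected by the local gradient difference so that it follows the network-average gradient.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, T = 4, 5, 50

# Toy monotone DR-submodular objectives (assumed for illustration):
# f_i(x) = b_i^T x - 0.5 x^T A_i x with A_i entrywise nonnegative,
# so the Hessian -A_i is entrywise nonpositive (the DR condition).
A = [0.05 * np.abs(rng.standard_normal((dim, dim))) for _ in range(n_nodes)]
A = [(M + M.T) / 2 for M in A]
b = [1.0 + np.abs(rng.standard_normal(dim)) for _ in range(n_nodes)]

def grad(i, xi):
    return b[i] - A[i] @ xi

# Doubly stochastic mixing matrix for a ring of 4 nodes.
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

x = np.zeros((n_nodes, dim))                            # local iterates, start at 0
y = np.array([grad(i, x[i]) for i in range(n_nodes)])   # gradient trackers

for t in range(T):
    g_old = np.array([grad(i, x[i]) for i in range(n_nodes)])
    v = (y > 0).astype(float)   # linear maximization oracle over the box [0,1]^dim
    x = W @ x + v / T           # consensus mixing + Frank-Wolfe-style step of size 1/T
    g_new = np.array([grad(i, x[i]) for i in range(n_nodes)])
    y = W @ y + (g_new - g_old) # tracking correction: y follows the average gradient
```

After $T$ steps each iterate is a sum of $T$ feasible directions scaled by $1/T$, so it remains in the box; only $O(1)$ vectors per node are exchanged each round, which is the source of the communication efficiency the abstract highlights.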

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-xie19b,
  title     = {Decentralized Gradient Tracking for Continuous DR-Submodular Maximization},
  author    = {Xie, Jiahao and Zhang, Chao and Shen, Zebang and Mi, Chao and Qian, Hui},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2897--2906},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/xie19b/xie19b.pdf},
  url       = {https://proceedings.mlr.press/v89/xie19b.html},
  abstract  = {In this paper, we focus on the continuous DR-submodular maximization over a network. By using the gradient tracking technique, two decentralized algorithms are proposed for deterministic and stochastic settings, respectively. The proposed methods attain the $\epsilon$-accuracy tight approximation ratio for monotone continuous DR-submodular functions in only $O(1/\epsilon)$ and $\tilde{O}(1/\epsilon)$ rounds of communication, respectively, which are superior to the state-of-the-art. Our numerical results show that the proposed methods outperform existing decentralized methods in terms of both computation and communication complexity.}
}
Endnote
%0 Conference Paper
%T Decentralized Gradient Tracking for Continuous DR-Submodular Maximization
%A Jiahao Xie
%A Chao Zhang
%A Zebang Shen
%A Chao Mi
%A Hui Qian
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-xie19b
%I PMLR
%P 2897--2906
%U https://proceedings.mlr.press/v89/xie19b.html
%V 89
%X In this paper, we focus on the continuous DR-submodular maximization over a network. By using the gradient tracking technique, two decentralized algorithms are proposed for deterministic and stochastic settings, respectively. The proposed methods attain the $\epsilon$-accuracy tight approximation ratio for monotone continuous DR-submodular functions in only $O(1/\epsilon)$ and $\tilde{O}(1/\epsilon)$ rounds of communication, respectively, which are superior to the state-of-the-art. Our numerical results show that the proposed methods outperform existing decentralized methods in terms of both computation and communication complexity.
APA
Xie, J., Zhang, C., Shen, Z., Mi, C. & Qian, H. (2019). Decentralized Gradient Tracking for Continuous DR-Submodular Maximization. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2897-2906. Available from https://proceedings.mlr.press/v89/xie19b.html.