Distributed Weighted Matching via Randomized Composable Coresets

Sepehr Assadi, Mohammadhossein Bateni, Vahab Mirrokni
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:333-343, 2019.

Abstract

Maximum weight matching is one of the most fundamental combinatorial optimization problems with a wide range of applications in data mining and bioinformatics. Developing distributed weighted matching algorithms has been challenging due to the sequential nature of efficient algorithms for this problem. In this paper, we develop a simple distributed algorithm for the problem on general graphs with approximation guarantee of 2 + ε that (nearly) matches that of the sequential greedy algorithm. A key advantage of this algorithm is that it can be easily implemented in only two rounds of computation in modern parallel computation frameworks such as MapReduce. We also demonstrate the efficiency of our algorithm in practice on various graphs (some with half a trillion edges) by achieving objective values always close to what is achievable in the centralized setting.
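The abstract mentions the sequential greedy algorithm (a classic 2-approximation for maximum weight matching) and a two-round distributed scheme built on randomized composable coresets. A minimal sketch of both ideas, assuming a random edge partition across machines followed by a greedy pass over the union of the per-machine matchings; the function names and partitioning details are illustrative, not taken from the paper:

```python
import random

def greedy_matching(edges):
    """Sequential greedy: scan edges in decreasing weight order and add an
    edge whenever both endpoints are still free. This is the classic
    2-approximation for maximum weight matching."""
    matched = set()
    matching = []
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        if u not in matched and v not in matched:
            matching.append((u, v, w))
            matched.add(u)
            matched.add(v)
    return matching

def two_round_matching(edges, k, seed=0):
    """Illustrative two-round scheme: randomly partition the edges across
    k machines, have each machine send back a greedy matching of its
    share (its coreset), then greedily match the union of coresets on a
    single coordinator."""
    rng = random.Random(seed)
    parts = [[] for _ in range(k)]
    for e in edges:  # random edge partition
        parts[rng.randrange(k)].append(e)
    coresets = [greedy_matching(p) for p in parts]  # round 1, in parallel
    union = [e for coreset in coresets for e in coreset]
    return greedy_matching(union)  # round 2, on the coordinator
```

The key property exploited by composable coresets is that each machine's output is tiny (a matching has at most n/2 edges), so the coordinator's round-2 input fits in a single machine's memory even when the input graph does not.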

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-assadi19a,
  title     = {Distributed Weighted Matching via Randomized Composable Coresets},
  author    = {Assadi, Sepehr and Bateni, Mohammadhossein and Mirrokni, Vahab},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {333--343},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/assadi19a/assadi19a.pdf},
  url       = {https://proceedings.mlr.press/v97/assadi19a.html},
  abstract  = {Maximum weight matching is one of the most fundamental combinatorial optimization problems with a wide range of applications in data mining and bioinformatics. Developing distributed weighted matching algorithms has been challenging due to the sequential nature of efficient algorithms for this problem. In this paper, we develop a simple distributed algorithm for the problem on general graphs with approximation guarantee of 2 + eps that (nearly) matches that of the sequential greedy algorithm. A key advantage of this algorithm is that it can be easily implemented in only two rounds of computation in modern parallel computation frameworks such as MapReduce. We also demonstrate the efficiency of our algorithm in practice on various graphs (some with half a trillion edges) by achieving objective values always close to what is achievable in the centralized setting.}
}
Endnote
%0 Conference Paper
%T Distributed Weighted Matching via Randomized Composable Coresets
%A Sepehr Assadi
%A Mohammadhossein Bateni
%A Vahab Mirrokni
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-assadi19a
%I PMLR
%P 333--343
%U https://proceedings.mlr.press/v97/assadi19a.html
%V 97
%X Maximum weight matching is one of the most fundamental combinatorial optimization problems with a wide range of applications in data mining and bioinformatics. Developing distributed weighted matching algorithms has been challenging due to the sequential nature of efficient algorithms for this problem. In this paper, we develop a simple distributed algorithm for the problem on general graphs with approximation guarantee of 2 + eps that (nearly) matches that of the sequential greedy algorithm. A key advantage of this algorithm is that it can be easily implemented in only two rounds of computation in modern parallel computation frameworks such as MapReduce. We also demonstrate the efficiency of our algorithm in practice on various graphs (some with half a trillion edges) by achieving objective values always close to what is achievable in the centralized setting.
APA
Assadi, S., Bateni, M., & Mirrokni, V. (2019). Distributed Weighted Matching via Randomized Composable Coresets. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:333-343. Available from https://proceedings.mlr.press/v97/assadi19a.html.