Dual Entangled Polynomial Code: Three-Dimensional Coding for Distributed Matrix Multiplication

Pedro Soto, Jun Li, Xiaodi Fan
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5937-5945, 2019.

Abstract

Matrix multiplication is a fundamental building block in various machine learning algorithms. When the matrices come from a large dataset, the multiplication can be split into multiple tasks that compute products of submatrices on different nodes. As some nodes may be stragglers, coding schemes have been proposed to tolerate stragglers in such distributed matrix multiplication. However, existing coding schemes typically split the matrices in only one or two dimensions, limiting their ability to handle large-scale matrix multiplication. For three-dimensional coding, on the other hand, no code construction is known that achieves the optimal number of tasks required for decoding; the best result so far is achieved by entangled polynomial (EP) codes. In this paper, we propose dual entangled polynomial (DEP) codes, which require around 25% fewer tasks than EP codes by executing two matrix multiplications on each task. With experiments in a real cloud environment, we show that DEP codes also reduce the decoding overhead and the memory consumption of tasks.
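To make the setting concrete, the sketch below (Python with NumPy) illustrates straggler-tolerant coded matrix multiplication with the simplest, one-dimensional split: A is cut into m row blocks, each of n > m workers multiplies one coded block by B, and the full product is decodable from any m worker results. This is not the paper's EP or DEP construction; the function and variable names are illustrative assumptions only.

import numpy as np

def encode(A, m, n):
    """Split A into m row blocks and return n coded blocks (Vandermonde combinations)."""
    blocks = np.split(A, m, axis=0)
    points = np.arange(1, n + 1)                     # distinct evaluation points
    coded = [sum(blocks[i] * float(x) ** i for i in range(m)) for x in points]
    return coded, points

def decode(outputs, points, m):
    """Recover [A_0 B; ...; A_{m-1} B] from the outputs of any m finished workers."""
    V = np.vander(np.asarray(points, dtype=float), m, increasing=True)
    stacked = np.stack(outputs)                      # shape: (m, rows_per_block, cols)
    flat = np.linalg.solve(V, stacked.reshape(m, -1))
    return np.concatenate(list(flat.reshape(stacked.shape)), axis=0)

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 5))
B = rng.standard_normal((5, 3))
coded, points = encode(A, m=4, n=6)                  # 6 workers, any 4 results suffice
outputs = [Ak @ B for Ak in coded]                   # each worker computes one product
survivors = [0, 2, 3, 5]                             # suppose workers 1 and 4 straggle
C = decode([outputs[k] for k in survivors], points[survivors], m=4)
assert np.allclose(C, A @ B)

The paper's EP and DEP codes go further by splitting the multiplication along all three dimensions; DEP additionally has each task carry out two matrix multiplications, which is where the roughly 25% reduction in required tasks relative to EP codes comes from.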

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-soto19a,
  title     = {Dual Entangled Polynomial Code: Three-Dimensional Coding for Distributed Matrix Multiplication},
  author    = {Soto, Pedro and Li, Jun and Fan, Xiaodi},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5937--5945},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/soto19a/soto19a.pdf},
  url       = {https://proceedings.mlr.press/v97/soto19a.html},
  abstract  = {Matrix multiplication is a fundamental building block in various machine learning algorithms. When the matrix comes from a large dataset, the multiplication can be split into multiple tasks which calculate the multiplication of submatrices on different nodes. As some nodes may be stragglers, coding schemes have been proposed to tolerate stragglers in such distributed matrix multiplication. However, existing coding schemes typically split the matrices in only one or two dimensions, limiting their capabilities to handle large-scale matrix multiplication. Three-dimensional coding, however, does not have any code construction that achieves the optimal number of tasks required for decoding, with the best result achieved by entangled polynomial (EP) codes. In this paper, we propose dual entangled polynomial (DEP) codes that require around 25% fewer tasks than EP codes by executing two matrix multiplications on each task. With experiments in a real cloud environment, we show that DEP codes can also save the decoding overhead and memory consumption of tasks.}
}
Endnote
%0 Conference Paper
%T Dual Entangled Polynomial Code: Three-Dimensional Coding for Distributed Matrix Multiplication
%A Pedro Soto
%A Jun Li
%A Xiaodi Fan
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-soto19a
%I PMLR
%P 5937--5945
%U https://proceedings.mlr.press/v97/soto19a.html
%V 97
%X Matrix multiplication is a fundamental building block in various machine learning algorithms. When the matrix comes from a large dataset, the multiplication can be split into multiple tasks which calculate the multiplication of submatrices on different nodes. As some nodes may be stragglers, coding schemes have been proposed to tolerate stragglers in such distributed matrix multiplication. However, existing coding schemes typically split the matrices in only one or two dimensions, limiting their capabilities to handle large-scale matrix multiplication. Three-dimensional coding, however, does not have any code construction that achieves the optimal number of tasks required for decoding, with the best result achieved by entangled polynomial (EP) codes. In this paper, we propose dual entangled polynomial (DEP) codes that require around 25% fewer tasks than EP codes by executing two matrix multiplications on each task. With experiments in a real cloud environment, we show that DEP codes can also save the decoding overhead and memory consumption of tasks.
APA
Soto, P., Li, J., & Fan, X. (2019). Dual Entangled Polynomial Code: Three-Dimensional Coding for Distributed Matrix Multiplication. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5937-5945. Available from https://proceedings.mlr.press/v97/soto19a.html.