Computation Efficient Coded Linear Transform

Sinong Wang, Jiashang Liu, Ness Shroff, Pengyu Yang
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:577-585, 2019.

Abstract

In large-scale distributed linear transform problems, coded computation plays an important role in reducing the delay caused by slow machines (stragglers). However, existing coded schemes can destroy the significant sparsity that exists in large-scale machine learning problems and, in turn, increase the computational delay. In this paper, we propose a coded computation strategy, referred to as the diagonal code, that achieves both the optimum recovery threshold and the optimum computation load. Furthermore, by leveraging ideas from random proposal graph theory, we design a random code that achieves a constant computation load, which significantly outperforms the best known existing results. We apply our schemes to the distributed gradient descent problem and demonstrate the advantage of our approach over the current fastest coded schemes.
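
To make the idea of straggler-tolerant coded computation concrete, below is a minimal Python sketch of a toy (k+1, k) scheme: the matrix is split into k row blocks, one extra worker computes a parity (sum) block, and the product Ax can be recovered from any k of the k+1 partial results. This only illustrates the notion of a recovery threshold; it is not the paper's diagonal code or its random code, and all function names here are illustrative.

import numpy as np

# Toy coded matrix-vector multiplication with one parity worker.
# A is split into k row blocks; worker k computes the sum (parity) block,
# so Ax can be recovered from ANY k of the k+1 results, i.e. the recovery
# threshold is k and one straggler can be tolerated.

def encode(A, k):
    """Split A into k row blocks and append one parity block (their sum)."""
    blocks = np.array_split(A, k, axis=0)
    # The sum-parity only works when all blocks have the same height,
    # so this sketch assumes the number of rows is divisible by k.
    parity = sum(blocks)
    return blocks + [parity]

def worker(block, x):
    """Each worker computes a partial product on its assigned block."""
    return block @ x

def decode(results, k):
    """Recover the k data-block products from any k returned results.

    `results` maps worker index -> partial product; index k is the parity.
    """
    assert len(results) >= k, "need at least k results (recovery threshold)"
    if all(i in results for i in range(k)):
        return [results[i] for i in range(k)]
    # Exactly one data block is missing: reconstruct it from the parity.
    missing = next(i for i in range(k) if i not in results)
    recovered = results[k] - sum(results[i] for i in range(k) if i != missing)
    return [results[i] if i != missing else recovered for i in range(k)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 4))
    x = rng.standard_normal(4)
    k = 3
    encoded = encode(A, k)
    # Pretend worker 1 is a straggler: drop its result.
    results = {i: worker(b, x) for i, b in enumerate(encoded) if i != 1}
    y = np.concatenate(decode(results, k))
    assert np.allclose(y, A @ x)
    print("recovered Ax from", len(results), "of", k + 1, "workers")
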

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-wang19a,
  title     = {Computation Efficient Coded Linear Transform},
  author    = {Wang, Sinong and Liu, Jiashang and Shroff, Ness and Yang, Pengyu},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {577--585},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/wang19a/wang19a.pdf},
  url       = {https://proceedings.mlr.press/v89/wang19a.html},
  abstract  = {In large-scale distributed linear transform problems, coded computation plays an important role to reduce the delay caused by slow machines. However, existing coded schemes could end up destroying the significant sparsity that exists in large-scale machine learning problems, and in turn increase the computational delay. In this paper, we propose a coded computation strategy, referred to as diagonal code, that achieves the optimum recovery threshold and the optimum computation load. Furthermore, by leveraging the ideas from random proposal graph theory, we design a random code that achieves a constant computation load, which significantly outperforms the existing best known result. We apply our schemes to the distributed gradient descent problem and demonstrate the advantage of the approach over current fastest coded schemes.}
}
APA
Wang, S., Liu, J., Shroff, N. & Yang, P. (2019). Computation Efficient Coded Linear Transform. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:577-585. Available from https://proceedings.mlr.press/v89/wang19a.html.
