Multiplying Matrices Without Multiplying

Davis Blalock, John Guttag
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:992-1004, 2021.

Abstract

Multiplying matrices is among the most fundamental and most computationally demanding operations in machine learning and scientific computing. Consequently, the task of efficiently approximating matrix products has received significant attention. We introduce a learning-based algorithm for this task that greatly outperforms existing methods. Experiments using hundreds of matrices from diverse domains show that it often runs 10x faster than alternatives at a given level of error, as well as 100x faster than exact matrix multiplication. In the common case that one matrix is known ahead of time, our method also has the interesting property that it requires zero multiply-adds. These results suggest that a mixture of hashing, averaging, and byte shuffling (the core operations of our method) could be a more promising building block for machine learning than the sparsified, factorized, and/or scalar quantized matrix products that have recently been the focus of substantial research and hardware investment.
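The abstract does not spell out the algorithm, but the named ingredients (hashing inputs to codes, averaging, and table lookups in place of multiply-adds) are the hallmarks of product-quantization-style approximate matrix multiplication. The sketch below is an illustrative toy version under that assumption, not the authors' method: it encodes each row of A with a plain per-subspace k-means quantizer (the paper instead learns multiply-free hash functions), precomputes lookup tables from the known matrix B, and then answers A @ B using only table lookups and additions.

# Illustrative sketch of lookup-based approximate matrix multiplication.
# NOT the paper's exact algorithm: the encoder here is ordinary k-means,
# whereas the paper uses learned, comparison-based hash functions.
import numpy as np

def fit_prototypes(A_train, n_subspaces=4, n_protos=16, n_iters=10, seed=0):
    """Learn per-subspace prototypes with a tiny k-means."""
    rng = np.random.default_rng(seed)
    D = A_train.shape[1]
    splits = np.array_split(np.arange(D), n_subspaces)
    prototypes = []
    for idx in splits:
        X = A_train[:, idx]
        centroids = X[rng.choice(len(X), n_protos, replace=False)]
        for _ in range(n_iters):
            # assign each row to its nearest centroid, then recompute means
            dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
            labels = dists.argmin(1)
            for k in range(n_protos):
                if np.any(labels == k):
                    centroids[k] = X[labels == k].mean(0)
        prototypes.append(centroids)
    return splits, prototypes

def encode(A, splits, prototypes):
    """Hash each row of A to one small integer code per subspace."""
    codes = np.empty((A.shape[0], len(splits)), dtype=np.uint8)
    for c, (idx, protos) in enumerate(zip(splits, prototypes)):
        X = A[:, idx]
        dists = ((X[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
        codes[:, c] = dists.argmin(1)
    return codes

def build_tables(B, splits, prototypes):
    """Precompute prototype-times-B-slice dot products; shape (C, K, M)."""
    return np.stack([protos @ B[idx, :] for idx, protos in zip(splits, prototypes)])

def approx_matmul(codes, tables):
    """Approximate A @ B: the aggregation uses only table lookups and additions."""
    N, C = codes.shape
    out = np.zeros((N, tables.shape[2]))
    for c in range(C):
        out += tables[c, codes[:, c], :]
    return out

# usage: compare against the exact product on random data
rng = np.random.default_rng(0)
A, B = rng.normal(size=(500, 64)), rng.normal(size=(64, 8))
splits, protos = fit_prototypes(A)
codes = encode(A, splits, protos)
tables = build_tables(B, splits, protos)
err = np.linalg.norm(approx_matmul(codes, tables) - A @ B) / np.linalg.norm(A @ B)
print(f"relative error: {err:.3f}")

In this sketch the encoding step still performs multiplications; part of the paper's contribution is a learned encoder built from comparisons, so that the entire query-time path is multiply-free once B is known ahead of time.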

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-blalock21a,
  title     = {Multiplying Matrices Without Multiplying},
  author    = {Blalock, Davis and Guttag, John},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {992--1004},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/blalock21a/blalock21a.pdf},
  url       = {https://proceedings.mlr.press/v139/blalock21a.html},
  abstract  = {Multiplying matrices is among the most fundamental and most computationally demanding operations in machine learning and scientific computing. Consequently, the task of efficiently approximating matrix products has received significant attention. We introduce a learning-based algorithm for this task that greatly outperforms existing methods. Experiments using hundreds of matrices from diverse domains show that it often runs 10x faster than alternatives at a given level of error, as well as 100x faster than exact matrix multiplication. In the common case that one matrix is known ahead of time, our method also has the interesting property that it requires zero multiply-adds. These results suggest that a mixture of hashing, averaging, and byte shuffling (the core operations of our method) could be a more promising building block for machine learning than the sparsified, factorized, and/or scalar quantized matrix products that have recently been the focus of substantial research and hardware investment.}
}
Endnote
%0 Conference Paper
%T Multiplying Matrices Without Multiplying
%A Davis Blalock
%A John Guttag
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-blalock21a
%I PMLR
%P 992--1004
%U https://proceedings.mlr.press/v139/blalock21a.html
%V 139
%X Multiplying matrices is among the most fundamental and most computationally demanding operations in machine learning and scientific computing. Consequently, the task of efficiently approximating matrix products has received significant attention. We introduce a learning-based algorithm for this task that greatly outperforms existing methods. Experiments using hundreds of matrices from diverse domains show that it often runs 10x faster than alternatives at a given level of error, as well as 100x faster than exact matrix multiplication. In the common case that one matrix is known ahead of time, our method also has the interesting property that it requires zero multiply-adds. These results suggest that a mixture of hashing, averaging, and byte shuffling (the core operations of our method) could be a more promising building block for machine learning than the sparsified, factorized, and/or scalar quantized matrix products that have recently been the focus of substantial research and hardware investment.
APA
Blalock, D. & Guttag, J. (2021). Multiplying Matrices Without Multiplying. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:992-1004. Available from https://proceedings.mlr.press/v139/blalock21a.html.
