Sample complexity bounds for localized sketching

Rakshith Sharma Srinivasa, Mark Davenport, Justin Romberg
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3275-3284, 2020.

Abstract

We consider sketched approximate matrix multiplication and ridge regression in the novel setting of localized sketching, where at any given point, only part of the data matrix is available. This corresponds to a block diagonal structure on the sketching matrix. We show that, under mild conditions, block diagonal sketching matrices require only $O(\mathrm{sr}/\epsilon^2)$ and $O(\mathrm{sd}_{\lambda}/\epsilon)$ total sample complexity for matrix multiplication and ridge regression, respectively. This matches the state-of-the-art bounds that are obtained using global sketching matrices. The localized nature of sketching considered allows for different parts of the data matrix to be sketched independently and hence is more amenable to computation in distributed and streaming settings and results in a smaller memory and computational footprint.
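As a toy illustration of the localized setting (not code from the paper), the snippet below sketches each row block of a data matrix independently with its own Gaussian sketch. Stacking the per-block results is equivalent to applying one block-diagonal sketching matrix, and the sketched Gram matrix still approximates $A^\top A$. All dimensions and the per-block sketch size `m` are illustrative assumptions, not the paper's rates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 4 data owners, each holding a 100 x 8 row block of A.
n_blocks, rows_per_block, d = 4, 100, 8
blocks = [rng.standard_normal((rows_per_block, d)) for _ in range(n_blocks)]
A = np.vstack(blocks)

m = 40  # sketch rows per block (an assumption, not the paper's exact bound)

# Each block A_i is sketched independently with its own Gaussian S_i whose
# entries are N(0, 1/m), so E[S_i^T S_i] = I. No node ever sees the full A;
# stacking S_i A_i equals applying the block-diagonal S = diag(S_1, ..., S_K).
sketched = [(rng.standard_normal((m, rows_per_block)) / np.sqrt(m)) @ B
            for B in blocks]
SA = np.vstack(sketched)

# (SA)^T (SA) = sum_i A_i^T S_i^T S_i A_i, an unbiased estimate of A^T A,
# i.e. sketched approximate matrix multiplication.
err = np.linalg.norm(SA.T @ SA - A.T @ A) / np.linalg.norm(A.T @ A)
print(f"relative error of sketched Gram matrix: {err:.3f}")
```

The same stacked sketch `SA` could feed a downstream sketched ridge regression solve; the point here is only that each block is compressed locally before any aggregation.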

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-srinivasa20a,
  title     = {Sample complexity bounds for localized sketching},
  author    = {Srinivasa, Rakshith Sharma and Davenport, Mark and Romberg, Justin},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {3275--3284},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/srinivasa20a/srinivasa20a.pdf},
  url       = {https://proceedings.mlr.press/v108/srinivasa20a.html},
  abstract  = {We consider sketched approximate matrix multiplication and ridge regression in the novel setting of localized sketching, where at any given point, only part of the data matrix is available. This corresponds to a block diagonal structure on the sketching matrix. We show that, under mild conditions, block diagonal sketching matrices require only $O(\sr / \epsilon^2)$ and $O(\sd_{\lambda}/\epsilon)$ total sample complexity for matrix multiplication and ridge regression, respectively. This matches the state-of-the-art bounds that are obtained using global sketching matrices. The localized nature of sketching considered allows for different parts of the data matrix to be sketched independently and hence is more amenable to computation in distributed and streaming settings and results in a smaller memory and computational footprint.}
}
Endnote
%0 Conference Paper
%T Sample complexity bounds for localized sketching
%A Rakshith Sharma Srinivasa
%A Mark Davenport
%A Justin Romberg
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-srinivasa20a
%I PMLR
%P 3275--3284
%U https://proceedings.mlr.press/v108/srinivasa20a.html
%V 108
%X We consider sketched approximate matrix multiplication and ridge regression in the novel setting of localized sketching, where at any given point, only part of the data matrix is available. This corresponds to a block diagonal structure on the sketching matrix. We show that, under mild conditions, block diagonal sketching matrices require only $O(\sr / \epsilon^2)$ and $O(\sd_{\lambda}/\epsilon)$ total sample complexity for matrix multiplication and ridge regression, respectively. This matches the state-of-the-art bounds that are obtained using global sketching matrices. The localized nature of sketching considered allows for different parts of the data matrix to be sketched independently and hence is more amenable to computation in distributed and streaming settings and results in a smaller memory and computational footprint.
APA
Srinivasa, R.S., Davenport, M. & Romberg, J. (2020). Sample complexity bounds for localized sketching. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:3275-3284. Available from https://proceedings.mlr.press/v108/srinivasa20a.html.