A Rank-1 Sketch for Matrix Multiplicative Weights

Yair Carmon, John C. Duchi, Aaron Sidford, Kevin Tian
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:589-623, 2019.

Abstract

We show that a simple randomized sketch of the matrix multiplicative weight (MMW) update enjoys (in expectation) the same regret bounds as MMW, up to a small constant factor. Unlike MMW, where every step requires full matrix exponentiation, our steps require only a single product of the form $e^A b$, which the Lanczos method approximates efficiently. Our key technique is to view the sketch as a \emph{randomized mirror projection}, and perform mirror descent analysis on the \emph{expected projection}. Our sketch solves the online eigenvector problem, improving the best known complexity bounds by $\Omega(\log^5 n)$. We also apply this sketch to semidefinite programming in saddle-point form, yielding a simple primal-dual scheme with guarantees matching the best in the literature.
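To make the step concrete, the following is a minimal Python sketch of one such randomized update, under our own naming and interface assumptions (an illustration of the idea, not the authors' implementation). Given the scaled running loss sum $S = -\eta \sum_s A_s$, it draws a random direction $u$, forms $v \propto e^{S/2}u$ with a single exponential-times-vector product (here SciPy's `expm_multiply` stands in for the Lanczos approximation), and returns the rank-1 density $vv^\top$ in place of the full MMW iterate $e^{S}/\operatorname{tr} e^{S}$.

```python
import numpy as np
from scipy.sparse.linalg import expm_multiply

def rank1_mmw_sketch(S, rng):
    """Rank-1 sketch of the MMW density matrix e^S / tr(e^S).

    S is the symmetric scaled running loss sum, S = -eta * sum_s A_s
    (the name and interface here are illustrative assumptions).
    Draws u uniformly at random in direction and returns v v^T with
    v = e^{S/2} u / ||e^{S/2} u||, using one exponential-times-vector
    product instead of a full matrix exponentiation.
    """
    u = rng.standard_normal(S.shape[0])  # Gaussian direction (uniform on the sphere)
    v = expm_multiply(S / 2.0, u)        # the single e^A b product per step
    v /= np.linalg.norm(v)
    return np.outer(v, v)                # PSD, trace 1: a valid density matrix

# Toy usage: a 4x4 diagonal running sum.
rng = np.random.default_rng(0)
S = np.diag([-0.1, -0.4, -0.7, -1.0])
Y = rank1_mmw_sketch(S, rng)
print(np.trace(Y))  # 1.0 up to floating-point error
```

The computational point is that the full MMW update needs a dense matrix exponential (cubic in $n$), while each sketch step needs only products of $S$ with a single vector, which the Lanczos method exploits when $S$ is sparse or otherwise supports fast matrix-vector products.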

Cite this Paper


BibTeX
@InProceedings{pmlr-v99-carmon19a,
  title     = {A Rank-1 Sketch for Matrix Multiplicative Weights},
  author    = {Carmon, Yair and Duchi, John C. and Sidford, Aaron and Tian, Kevin},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {589--623},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/carmon19a/carmon19a.pdf},
  url       = {https://proceedings.mlr.press/v99/carmon19a.html}
}
Endnote
%0 Conference Paper
%T A Rank-1 Sketch for Matrix Multiplicative Weights
%A Yair Carmon
%A John C. Duchi
%A Aaron Sidford
%A Kevin Tian
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-carmon19a
%I PMLR
%P 589--623
%U https://proceedings.mlr.press/v99/carmon19a.html
%V 99
APA
Carmon, Y., Duchi, J. C., Sidford, A., & Tian, K. (2019). A Rank-1 Sketch for Matrix Multiplicative Weights. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:589-623. Available from https://proceedings.mlr.press/v99/carmon19a.html.
