A Rank-1 Sketch for Matrix Multiplicative Weights
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:589-623, 2019.
Abstract
We show that a simple randomized sketch of the matrix multiplicative weight (MMW) update enjoys (in expectation) the same regret bounds as MMW, up to a small constant factor. Unlike MMW, where every step requires full matrix exponentiation, our steps require only a single product of the form $e^A b$, which the Lanczos method approximates efficiently. Our key technique is to view the sketch as a \emph{randomized mirror projection}, and perform mirror descent analysis on the \emph{expected projection}. Our sketch solves the online eigenvector problem, improving the best known complexity bounds by $\Omega(\log^5 n)$. We also apply this sketch to semidefinite programming in saddle-point form, yielding a simple primal-dual scheme with guarantees matching the best in the literature.
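The abstract's key computational primitive is the product $e^A b$ for a symmetric matrix $A$, approximated by the Lanczos method rather than forming $e^A$ explicitly. The following is a minimal, hypothetical sketch of this standard Lanczos approximation (not code from the paper): build a $k$-dimensional Krylov basis $Q$ with tridiagonal projection $T = Q^\top A Q$, then use $e^A b \approx \|b\|\, Q\, e^{T} e_1$. The function name and parameters are illustrative.

```python
import numpy as np
from scipy.linalg import expm

def lanczos_expm_vec(A, b, k=30):
    """Approximate e^A b for a symmetric matrix A via k Lanczos steps.

    Builds an orthonormal Krylov basis Q of K_k(A, b) and the
    tridiagonal projection T = Q^T A Q, then returns
    ||b|| * Q @ expm(T) @ e_1.
    """
    n = b.shape[0]
    k = min(k, n)
    Q = np.zeros((n, k))
    alpha = np.zeros(k)   # diagonal of T
    beta = np.zeros(k)    # off-diagonal of T
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # Full reorthogonalization for numerical stability.
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j + 1 < k:
            nrm = np.linalg.norm(w)
            if nrm < 1e-12:  # Krylov space exhausted early
                k = j + 1
                break
            beta[j] = nrm
            Q[:, j + 1] = w / nrm
    T = (np.diag(alpha[:k])
         + np.diag(beta[:k - 1], 1)
         + np.diag(beta[:k - 1], -1))
    e1 = np.zeros(k)
    e1[0] = 1.0
    return np.linalg.norm(b) * (Q[:, :k] @ (expm(T) @ e1))
```

Only $k$ matrix-vector products with $A$ are needed, and $e^T$ is computed on a small $k \times k$ tridiagonal matrix, which is what makes each sketched MMW step cheap relative to a full matrix exponential.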