Fast and robust tensor decomposition with applications to dictionary learning
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:1760-1793, 2017.
Abstract
We develop fast spectral algorithms for tensor decomposition that match the robustness guarantees of the best known polynomial-time algorithms for this problem, which are based on the sum-of-squares (SOS) semidefinite programming hierarchy. Our algorithms can decompose a 4-tensor with $n$-dimensional orthonormal components in the presence of error with constant spectral norm (when viewed as an $n^2$-by-$n^2$ matrix). The running time is $n^5$, which is close to linear in the input size $n^4$. We also obtain algorithms with similar running times to learn sparsely-used orthogonal dictionaries even when feature representations have constant relative sparsity and non-independent coordinates. The only previous polynomial-time algorithms to solve these problems are based on solving large semidefinite programs. In contrast, our algorithms are easy to implement directly and are based on spectral projections and tensor-mode rearrangements. Our work is inspired by recent work of Hopkins, Schramm, Shi, and Steurer (STOC'16) that shows how fast spectral algorithms can achieve the guarantees of SOS for average-case problems. In this work, we introduce general techniques to capture the guarantees of SOS for worst-case problems.
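To make the setting concrete, the sketch below illustrates the basic spectral approach to decomposing a noiseless 4-tensor with orthonormal components: contract two of the tensor's modes with a random vector and eigendecompose the resulting $n$-by-$n$ matrix (a classical Jennrich-style baseline). This is not the paper's robust algorithm, which additionally handles error of constant spectral norm via spectral projections and tensor-mode rearrangements; the instance construction here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Hypothetical instance: orthonormal components a_1, ..., a_n
# obtained from the QR factorization of a random Gaussian matrix.
A = np.linalg.qr(rng.standard_normal((n, n)))[0]

# Noiseless 4-tensor T = sum_i a_i (x) a_i (x) a_i (x) a_i.
T = np.einsum('ri,si,ti,ui->rstu', A, A, A, A)

# Contract the last two modes with a random Gaussian vector g:
# T(I, I, g, g) = sum_i <a_i, g>^2 a_i a_i^T.
g = rng.standard_normal(n)
M = np.einsum('rstu,t,u->rs', T, g, g)

# Spectral step: for generic g the eigenvalues <a_i, g>^2 are
# distinct, so the eigenvectors of M are the components a_i
# (up to sign and permutation).
eigvals, eigvecs = np.linalg.eigh(M)

# |eigvecs^T A| should be (numerically) a permutation matrix.
overlap = np.abs(eigvecs.T @ A)
```

With noise of constant spectral norm added to the $n^2$-by-$n^2$ flattening of $T$, this simple contraction breaks down; the paper's contribution is an algorithm with comparable simplicity whose guarantees survive such error.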