Orthogonalized ALS: A Theoretically Principled Tensor Decomposition Algorithm for Practical Use

Vatsal Sharan, Gregory Valiant
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3095-3104, 2017.

Abstract

The popular Alternating Least Squares (ALS) algorithm for tensor decomposition is efficient and easy to implement, but often converges to poor local optima—particularly when the weights of the factors are non-uniform. We propose a modification of the ALS approach that is as efficient as standard ALS, but provably recovers the true factors with random initialization under standard incoherence assumptions on the factors of the tensor. We demonstrate the significant practical superiority of our approach over traditional ALS for a variety of tasks on synthetic data—including tensor factorization on exact, noisy and over-complete tensors, as well as tensor completion—and for computing word embeddings from a third-order word tri-occurrence tensor.
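The abstract's key idea — running ALS but orthogonalizing the factor estimates — can be sketched as follows. This is a minimal, illustrative implementation, not the authors' code: it assumes a CP decomposition of a third-order tensor, orthogonalizes the running factor estimates via QR for the first few sweeps, and then reverts to standard ALS least-squares updates (a hybrid schedule; the exact schedule in the paper may differ). All function and parameter names here are made up for illustration.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(X, Y):
    """Column-wise Kronecker (Khatri-Rao) product."""
    r = X.shape[1]
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, r)

def orth_als(T, rank, n_iter=100, n_orth=5, seed=0):
    """CP decomposition by ALS with random initialization.

    For the first n_orth sweeps, each factor estimate is replaced by an
    orthonormal basis for its column space (QR) before the least-squares
    updates -- the orthogonalization step that distinguishes this sketch
    from standard ALS.
    """
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((d, rank)) for d in T.shape)
    for it in range(n_iter):
        if it < n_orth:
            # Orthogonalization step (the modification to plain ALS).
            A, B, C = (np.linalg.qr(M)[0] for M in (A, B, C))
        # Standard ALS updates: solve a least-squares problem per mode.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B)).T
    return A, B, C
```

On an exact low-rank tensor with random (incoherent) factors — the regime the paper's guarantees address — this sketch typically recovers a decomposition that reconstructs the input tensor to high accuracy, whereas plain ALS with random initialization is more prone to poor local optima when factor weights are non-uniform.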

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-sharan17a,
  title     = {Orthogonalized {ALS}: A Theoretically Principled Tensor Decomposition Algorithm for Practical Use},
  author    = {Vatsal Sharan and Gregory Valiant},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {3095--3104},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/sharan17a/sharan17a.pdf},
  url       = {https://proceedings.mlr.press/v70/sharan17a.html},
  abstract  = {The popular Alternating Least Squares (ALS) algorithm for tensor decomposition is efficient and easy to implement, but often converges to poor local optima—particularly when the weights of the factors are non-uniform. We propose a modification of the ALS approach that is as efficient as standard ALS, but provably recovers the true factors with random initialization under standard incoherence assumptions on the factors of the tensor. We demonstrate the significant practical superiority of our approach over traditional ALS for a variety of tasks on synthetic data—including tensor factorization on exact, noisy and over-complete tensors, as well as tensor completion—and for computing word embeddings from a third-order word tri-occurrence tensor.}
}
Endnote
%0 Conference Paper
%T Orthogonalized ALS: A Theoretically Principled Tensor Decomposition Algorithm for Practical Use
%A Vatsal Sharan
%A Gregory Valiant
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-sharan17a
%I PMLR
%P 3095--3104
%U https://proceedings.mlr.press/v70/sharan17a.html
%V 70
%X The popular Alternating Least Squares (ALS) algorithm for tensor decomposition is efficient and easy to implement, but often converges to poor local optima—particularly when the weights of the factors are non-uniform. We propose a modification of the ALS approach that is as efficient as standard ALS, but provably recovers the true factors with random initialization under standard incoherence assumptions on the factors of the tensor. We demonstrate the significant practical superiority of our approach over traditional ALS for a variety of tasks on synthetic data—including tensor factorization on exact, noisy and over-complete tensors, as well as tensor completion—and for computing word embeddings from a third-order word tri-occurrence tensor.
APA
Sharan, V. & Valiant, G. (2017). Orthogonalized ALS: A Theoretically Principled Tensor Decomposition Algorithm for Practical Use. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:3095-3104. Available from https://proceedings.mlr.press/v70/sharan17a.html.