Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion

Tian Tong, Cong Ma, Ashley Prater-Bennette, Erin Tripp, Yuejie Chi
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:2607-2617, 2022.

Abstract

Tensors, which provide a powerful and flexible model for representing multi-attribute data and multi-way interactions, play an indispensable role in modern data science across various fields in science and engineering. A fundamental task is tensor completion, which aims to faithfully recover the tensor from a small subset of its entries in a statistically and computationally efficient manner. Harnessing the low-rank structure of tensors in the Tucker decomposition, this paper develops a scaled gradient descent (ScaledGD) algorithm to directly recover the tensor factors with tailored spectral initializations, and shows that it provably converges at a linear rate independent of the condition number of the ground truth tensor for tensor completion as soon as the sample size is above the order of $n^{3/2}$ ignoring other parameter dependencies, where $n$ is the dimension of the tensor. To the best of our knowledge, ScaledGD is the first algorithm that achieves near-optimal statistical and computational complexities simultaneously for low-rank tensor completion with the Tucker decomposition. Our algorithm highlights the power of appropriate preconditioning in accelerating nonconvex statistical estimation, where the iteration-varying preconditioners promote desirable invariance properties of the trajectory with respect to the underlying symmetry in low-rank tensor factorization.
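The preconditioning idea summarized above is easiest to see in the simpler matrix-factorization setting, where each factor's gradient is right-multiplied by the inverse Gram matrix of the other factor, making the update invariant to invertible re-parameterizations of the factorization. The sketch below is only an illustrative matrix-completion analog of that idea, not the paper's Tucker-tensor algorithm; the loss, step size, iteration count, and function name are assumptions for illustration.

import numpy as np

def scaled_gd_matrix_completion(Y, mask, r, eta=0.5, n_iters=200):
    # Illustrative ScaledGD sketch for matrix completion X ~= L @ R.T,
    # given as the matrix-case analog of the preconditioning described above;
    # the paper itself treats Tucker-format tensors.
    p = mask.mean()  # assumed uniform observation probability

    # Spectral initialization: top-r SVD of the zero-filled, rescaled observations.
    U, s, Vt = np.linalg.svd(mask * Y / p, full_matrices=False)
    L = U[:, :r] * np.sqrt(s[:r])
    R = Vt[:r, :].T * np.sqrt(s[:r])

    for _ in range(n_iters):
        # Gradient of the rescaled squared loss over observed entries.
        resid = mask * (L @ R.T - Y) / p
        grad_L = resid @ R
        grad_R = resid.T @ L

        # Scaled updates: right-multiply each gradient by the inverse Gram
        # matrix of the other factor. This iteration-varying preconditioning
        # is what removes the condition-number dependence of plain gradient
        # descent on the convergence rate.
        L_new = L - eta * grad_L @ np.linalg.inv(R.T @ R)
        R_new = R - eta * grad_R @ np.linalg.inv(L.T @ L)
        L, R = L_new, R_new

    return L @ R.T

In the Tucker setting, the paper preconditions the factor and core updates analogously, which is why the convergence rate it establishes does not degrade with the condition number of the ground-truth tensor.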

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-tong22a,
  title     = {Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion},
  author    = {Tong, Tian and Ma, Cong and Prater-Bennette, Ashley and Tripp, Erin and Chi, Yuejie},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {2607--2617},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/tong22a/tong22a.pdf},
  url       = {https://proceedings.mlr.press/v151/tong22a.html},
  abstract  = {Tensors, which provide a powerful and flexible model for representing multi-attribute data and multi-way interactions, play an indispensable role in modern data science across various fields in science and engineering. A fundamental task is tensor completion, which aims to faithfully recover the tensor from a small subset of its entries in a statistically and computationally efficient manner. Harnessing the low-rank structure of tensors in the Tucker decomposition, this paper develops a scaled gradient descent (ScaledGD) algorithm to directly recover the tensor factors with tailored spectral initializations, and shows that it provably converges at a linear rate independent of the condition number of the ground truth tensor for tensor completion as soon as the sample size is above the order of $n^{3/2}$ ignoring other parameter dependencies, where $n$ is the dimension of the tensor. To the best of our knowledge, ScaledGD is the first algorithm that achieves near-optimal statistical and computational complexities simultaneously for low-rank tensor completion with the Tucker decomposition. Our algorithm highlights the power of appropriate preconditioning in accelerating nonconvex statistical estimation, where the iteration-varying preconditioners promote desirable invariance properties of the trajectory with respect to the underlying symmetry in low-rank tensor factorization.}
}
Endnote
%0 Conference Paper
%T Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion
%A Tian Tong
%A Cong Ma
%A Ashley Prater-Bennette
%A Erin Tripp
%A Yuejie Chi
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-tong22a
%I PMLR
%P 2607--2617
%U https://proceedings.mlr.press/v151/tong22a.html
%V 151
%X Tensors, which provide a powerful and flexible model for representing multi-attribute data and multi-way interactions, play an indispensable role in modern data science across various fields in science and engineering. A fundamental task is tensor completion, which aims to faithfully recover the tensor from a small subset of its entries in a statistically and computationally efficient manner. Harnessing the low-rank structure of tensors in the Tucker decomposition, this paper develops a scaled gradient descent (ScaledGD) algorithm to directly recover the tensor factors with tailored spectral initializations, and shows that it provably converges at a linear rate independent of the condition number of the ground truth tensor for tensor completion as soon as the sample size is above the order of $n^{3/2}$ ignoring other parameter dependencies, where $n$ is the dimension of the tensor. To the best of our knowledge, ScaledGD is the first algorithm that achieves near-optimal statistical and computational complexities simultaneously for low-rank tensor completion with the Tucker decomposition. Our algorithm highlights the power of appropriate preconditioning in accelerating nonconvex statistical estimation, where the iteration-varying preconditioners promote desirable invariance properties of the trajectory with respect to the underlying symmetry in low-rank tensor factorization.
APA
Tong, T., Ma, C., Prater-Bennette, A., Tripp, E. & Chi, Y. (2022). Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Completion. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:2607-2617. Available from https://proceedings.mlr.press/v151/tong22a.html.
