Noisy Tensor Completion via the Sum-of-Squares Hierarchy

Boaz Barak, Ankur Moitra
29th Annual Conference on Learning Theory, PMLR 49:417-445, 2016.

Abstract

In the noisy tensor completion problem we observe m entries (whose locations are chosen uniformly at random) from an unknown n_1 \times n_2 \times n_3 tensor T. We assume that T is entry-wise close to being rank r. Our goal is to fill in its missing entries using as few observations as possible. Let n = \max(n_1, n_2, n_3). We show that if m = n^{3/2} r then there is a polynomial time algorithm based on the sixth level of the sum-of-squares hierarchy for completing it. Our estimate agrees with almost all of T’s entries almost exactly and works even when our observations are corrupted by noise. This is also the first algorithm for tensor completion that works in the overcomplete case when r > n, and in fact it works all the way up to r = n^{3/2 - ε}. Our proofs are short and simple and are based on establishing a new connection between noisy tensor completion (through the language of Rademacher complexity) and the task of refuting random constraint satisfaction problems. This connection seems to have gone unnoticed even in the context of matrix completion. Furthermore, we use this connection to show matching lower bounds. Our main technical result characterizes the Rademacher complexity of the sequence of norms that arise in the sum-of-squares relaxations of the tensor nuclear norm. These results point to an interesting new direction: can we explore computational vs. sample complexity tradeoffs through the sum-of-squares hierarchy?
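
For readers who want to experiment with the problem setup, below is a minimal Python sketch (assuming numpy) of the observation model the abstract describes: a synthetic rank-r tensor, corrupted by noise and revealed at m = n^{3/2} r uniformly random entries. This illustrates only the input to the problem; the paper's actual recovery procedure, a degree-six sum-of-squares relaxation, is not implemented here. The concrete values of n, r, and the noise level sigma are illustrative choices, not taken from the paper.

    # Sketch of the noisy tensor completion observation model.
    # Illustration only; this is NOT the paper's sum-of-squares algorithm.
    import numpy as np

    rng = np.random.default_rng(0)
    n, r, sigma = 50, 10, 0.01   # illustrative dimensions, rank, noise level

    # Rank-r tensor with entries T[i,j,l] = sum_k A[i,k] * B[j,k] * C[l,k]
    A, B, C = (rng.standard_normal((n, r)) for _ in range(3))
    T = np.einsum('ik,jk,lk->ijl', A, B, C)

    # Reveal m = n^{3/2} * r entries, chosen uniformly without replacement,
    # each corrupted by additive noise -- the scaling in the paper's guarantee.
    m = int(n ** 1.5 * r)
    flat = rng.choice(n ** 3, size=m, replace=False)
    i, j, l = np.unravel_index(flat, (n, n, n))
    observed = T[i, j, l] + sigma * rng.standard_normal(m)
    # A completion algorithm receives only (i, j, l, observed) and must output
    # an estimate that agrees with almost all entries of T almost exactly.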

Cite this Paper

BibTeX
@InProceedings{pmlr-v49-barak16,
  title     = {Noisy Tensor Completion via the Sum-of-Squares Hierarchy},
  author    = {Barak, Boaz and Moitra, Ankur},
  booktitle = {29th Annual Conference on Learning Theory},
  pages     = {417--445},
  year      = {2016},
  editor    = {Feldman, Vitaly and Rakhlin, Alexander and Shamir, Ohad},
  volume    = {49},
  series    = {Proceedings of Machine Learning Research},
  address   = {Columbia University, New York, New York, USA},
  month     = {23--26 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v49/barak16.pdf},
  url       = {https://proceedings.mlr.press/v49/barak16.html}
}
APA
Barak, B. & Moitra, A. (2016). Noisy Tensor Completion via the Sum-of-Squares Hierarchy. 29th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 49:417-445. Available from https://proceedings.mlr.press/v49/barak16.html.
