Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery

Cun Mu, Bo Huang, John Wright, Donald Goldfarb
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):73-81, 2014.

Abstract

Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms (SNN) of the unfolding matrices of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way n × n × ⋯ × n tensor of Tucker rank (r, r, …, r) from Gaussian measurements requires Ω(r n^{K-1}) observations. In contrast, a certain (intractable) nonconvex formulation needs only O(r^K + nrK) observations. We introduce a simple, new convex relaxation, which partially bridges this gap. Our new formulation succeeds with O(r^{⌊K/2⌋} n^{⌈K/2⌉}) observations. The lower bound for the SNN model follows from our new result on recovering signals with multiple structures (e.g. sparse, low rank), which indicates the significant suboptimality of the common approach of minimizing the sum of individual sparsity-inducing norms (e.g. ℓ_1, nuclear norm). Our new tractable formulation for low-rank tensor recovery shows how the sample complexity can be reduced by designing convex regularizers that exploit several structures jointly.
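To make the two regularizers contrasted in the abstract concrete, below is a minimal NumPy sketch (not the authors' code; the names unfold, snn, and square_norm are illustrative). snn sums the nuclear norms of all mode-k unfoldings, while square_norm takes the nuclear norm of a single, roughly square reshaping that groups the first ⌈K/2⌉ modes into rows, which is one simple variant of the paper's "square deal" idea. The recovery programs in the paper minimize such regularizers subject to linear measurements, which this sketch omits.

import numpy as np

# Illustrative sketch only: compares the SNN regularizer with a single
# nuclear norm of a balanced ("square") reshaping, for a K-way tensor.

def unfold(X, mode):
    """Mode-k unfolding: mode `mode` indexes the rows of the matrix."""
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def snn(X):
    """Sum of nuclear norms (SNN) of all mode-k unfoldings."""
    return sum(np.linalg.norm(unfold(X, k), ord='nuc') for k in range(X.ndim))

def square_norm(X):
    """Nuclear norm of a balanced reshaping: the first ceil(K/2) modes
    index rows, the remaining floor(K/2) modes index columns."""
    K = X.ndim
    rows = int(np.prod(X.shape[:(K + 1) // 2]))
    return np.linalg.norm(X.reshape(rows, -1), ord='nuc')

# Example: build a random Tucker-rank-(r, ..., r) tensor from a core and factors.
n, r, K = 10, 2, 4
G = np.random.randn(*([r] * K))                 # core tensor
U = [np.random.randn(n, r) for _ in range(K)]   # factor matrices
X = G
for k in range(K):
    # Mode-k product X x_k U[k]
    X = np.moveaxis(np.tensordot(U[k], X, axes=(1, k)), 0, k)

print("SNN regularizer:   ", snn(X))
print("Square regularizer:", square_norm(X))

Each unfolding of this tensor has rank at most r, while the square reshaping is an n^{⌈K/2⌉} × n^{⌊K/2⌋} matrix of rank at most r^{⌊K/2⌋}; the abstract's O(r^{⌊K/2⌋} n^{⌈K/2⌉}) sample complexity is what nuclear-norm minimization of this single matrix achieves, versus Ω(r n^{K-1}) for SNN.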

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-mu14,
  title     = {Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery},
  author    = {Mu, Cun and Huang, Bo and Wright, John and Goldfarb, Donald},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {73--81},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/mu14.pdf},
  url       = {https://proceedings.mlr.press/v32/mu14.html},
  abstract  = {Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms (SNN) of the unfolding matrices of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way $n \times n \times \cdots \times n$ tensor of Tucker rank $(r, r, \ldots, r)$ from Gaussian measurements requires $\Omega(r n^{K-1})$ observations. In contrast, a certain (intractable) nonconvex formulation needs only $O(r^K + nrK)$ observations. We introduce a simple, new convex relaxation, which partially bridges this gap. Our new formulation succeeds with $O(r^{\lfloor K/2 \rfloor} n^{\lceil K/2 \rceil})$ observations. The lower bound for the SNN model follows from our new result on recovering signals with multiple structures (e.g. sparse, low rank), which indicates the significant suboptimality of the common approach of minimizing the sum of individual sparsity-inducing norms (e.g. $\ell_1$, nuclear norm). Our new tractable formulation for low-rank tensor recovery shows how the sample complexity can be reduced by designing convex regularizers that exploit several structures jointly.}
}
APA
Mu, C., Huang, B., Wright, J. & Goldfarb, D. (2014). Square Deal: Lower Bounds and Improved Relaxations for Tensor Recovery. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):73-81. Available from https://proceedings.mlr.press/v32/mu14.html.
