Recovery Guarantees For Quadratic Tensors With Sparse Observations

Hongyang Zhang, Vatsal Sharan, Moses Charikar, Yingyu Liang
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:3322-3332, 2019.

Abstract

We consider the tensor completion problem of predicting the missing entries of a tensor. The commonly used CP model has a triple product form, but an alternate family of quadratic models, which are the sum of pairwise products instead of a triple product, has emerged from applications such as recommendation systems. Non-convex methods are the methods of choice for learning quadratic models, and this work examines their sample complexity and error guarantees. Our main result is that with the number of samples being only linear in the dimension, all local minima of the mean squared error objective are global minima and recover the original tensor. We substantiate our theoretical results with experiments on synthetic and real-world data.
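To make the contrast between the two model families concrete, the following NumPy sketch builds a toy tensor under each. The exact parameterization of the quadratic model (three factor matrices combined through pairwise inner products) is an illustrative assumption based on the abstract's description, not the paper's precise formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 3  # toy dimension and rank

# Factor matrices for the three modes (assumed parameterization).
U, V, W = (rng.standard_normal((n, r)) for _ in range(3))

# CP model: a triple product, T[i,j,k] = sum_l U[i,l] * V[j,l] * W[k,l].
T_cp = np.einsum('il,jl,kl->ijk', U, V, W)

# Quadratic model: a sum of pairwise inner products instead,
# T[i,j,k] = <U_i, V_j> + <U_i, W_k> + <V_j, W_k>.
T_quad = (np.einsum('il,jl->ij', U, V)[:, :, None]
          + np.einsum('il,kl->ik', U, W)[:, None, :]
          + np.einsum('jl,kl->jk', V, W)[None, :, :])

# Mean squared error restricted to a sparse set of observed entries,
# here a uniformly random mask as a stand-in for the sampling model.
mask = rng.random(T_quad.shape) < 0.1

def observed_mse(pred, obs, mask):
    return np.mean((pred[mask] - obs[mask]) ** 2)
```

Minimizing `observed_mse` over the factor matrices is the non-convex objective whose local minima the paper analyzes.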

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-zhang19h,
  title     = {Recovery Guarantees For Quadratic Tensors With Sparse Observations},
  author    = {Zhang, Hongyang and Sharan, Vatsal and Charikar, Moses and Liang, Yingyu},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {3322--3332},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/zhang19h/zhang19h.pdf},
  url       = {https://proceedings.mlr.press/v89/zhang19h.html},
  abstract  = {We consider the tensor completion problem of predicting the missing entries of a tensor. The commonly used CP model has a triple product form, but an alternate family of quadratic models which are the sum of pairwise products instead of a triple product have emerged from applications such as recommendation systems. Non-convex methods are the method of choice for learning quadratic models, and this work examines their sample complexity and error guarantee. Our main result is that with the number of samples being only linear in the dimension, all local minima of the mean squared error objective are global minima and recover the original tensor. We substantiate our theoretical results with experiments on synthetic and real-world data.}
}
Endnote
%0 Conference Paper
%T Recovery Guarantees For Quadratic Tensors With Sparse Observations
%A Hongyang Zhang
%A Vatsal Sharan
%A Moses Charikar
%A Yingyu Liang
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-zhang19h
%I PMLR
%P 3322--3332
%U https://proceedings.mlr.press/v89/zhang19h.html
%V 89
%X We consider the tensor completion problem of predicting the missing entries of a tensor. The commonly used CP model has a triple product form, but an alternate family of quadratic models which are the sum of pairwise products instead of a triple product have emerged from applications such as recommendation systems. Non-convex methods are the method of choice for learning quadratic models, and this work examines their sample complexity and error guarantee. Our main result is that with the number of samples being only linear in the dimension, all local minima of the mean squared error objective are global minima and recover the original tensor. We substantiate our theoretical results with experiments on synthetic and real-world data.
APA
Zhang, H., Sharan, V., Charikar, M. & Liang, Y. (2019). Recovery Guarantees For Quadratic Tensors With Sparse Observations. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:3322-3332. Available from https://proceedings.mlr.press/v89/zhang19h.html.