Sparse and Low-rank Tensor Estimation via Cubic Sketchings

Botao Hao, Anru R. Zhang, Guang Cheng
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1319-1330, 2020.

Abstract

In this paper, we propose a general framework for sparse and low-rank tensor estimation from cubic sketchings. A two-stage non-convex implementation is developed based on sparse tensor decomposition and thresholded gradient descent, which ensures exact recovery in the noiseless case and stable recovery in the noisy case with high probability. The non-asymptotic analysis sheds light on an interplay between optimization error and statistical error. The proposed procedure is shown to be rate-optimal under certain conditions. As a technical by-product, novel high-order concentration inequalities are derived for studying high-moment sub-Gaussian tensors. An interesting tensor formulation illustrates the potential application to high-order interaction pursuit in high-dimensional linear regression.
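The abstract's measurement model can be illustrated with a small numerical sketch. This is not the authors' code; it is a minimal example, under the assumption that each cubic sketching measurement takes the form y_i = ⟨a_i ⊗ a_i ⊗ a_i, T⟩ for a sparse, low-rank third-order tensor T and Gaussian sketch vectors a_i (all dimensions below are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
p, r, n = 10, 2, 5  # tensor dimension, CP rank, number of sketches (illustrative)

# Build a sparse rank-r tensor T = sum_k u_k (x) u_k (x) u_k with sparse factors u_k.
U = rng.normal(size=(p, r))
U[3:, :] = 0.0  # zero out most entries so each factor is sparse
T = np.einsum('jk,lk,mk->jlm', U, U, U)

# Cubic sketching: contract T along all three modes with the same vector a_i,
# i.e. y_i = sum_{j,l,m} T[j,l,m] * a_i[j] * a_i[l] * a_i[m].
A = rng.normal(size=(n, p))
y = np.einsum('jlm,ij,il,im->i', T, A, A, A)  # noiseless measurements
```

The estimation problem the paper studies is the inverse of this forward map: recover the sparse, low-rank T from the n scalar sketches y (plus noise), which the proposed two-stage procedure addresses.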

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-hao20a,
  title     = {Sparse and Low-rank Tensor Estimation via Cubic Sketchings},
  author    = {Hao, Botao and Zhang, Anru R. and Cheng, Guang},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {1319--1330},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/hao20a/hao20a.pdf},
  url       = {http://proceedings.mlr.press/v108/hao20a.html},
  abstract  = {In this paper, we propose a general framework for sparse and low-rank tensor estimation from cubic sketchings. A two-stage non-convex implementation is developed based on sparse tensor decomposition and thresholded gradient descent, which ensures exact recovery in the noiseless case and stable recovery in the noisy case with high probability. The non-asymptotic analysis sheds light on an interplay between optimization error and statistical error. The proposed procedure is shown to be rate-optimal under certain conditions. As a technical by-product, novel high-order concentration inequalities are derived for studying high-moment sub-Gaussian tensors. An interesting tensor formulation illustrates the potential application to high-order interaction pursuit in high-dimensional linear regression.}
}
Endnote
%0 Conference Paper
%T Sparse and Low-rank Tensor Estimation via Cubic Sketchings
%A Botao Hao
%A Anru R. Zhang
%A Guang Cheng
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-hao20a
%I PMLR
%P 1319--1330
%U http://proceedings.mlr.press/v108/hao20a.html
%V 108
%X In this paper, we propose a general framework for sparse and low-rank tensor estimation from cubic sketchings. A two-stage non-convex implementation is developed based on sparse tensor decomposition and thresholded gradient descent, which ensures exact recovery in the noiseless case and stable recovery in the noisy case with high probability. The non-asymptotic analysis sheds light on an interplay between optimization error and statistical error. The proposed procedure is shown to be rate-optimal under certain conditions. As a technical by-product, novel high-order concentration inequalities are derived for studying high-moment sub-Gaussian tensors. An interesting tensor formulation illustrates the potential application to high-order interaction pursuit in high-dimensional linear regression.
APA
Hao, B., Zhang, A. R., & Cheng, G. (2020). Sparse and Low-rank Tensor Estimation via Cubic Sketchings. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:1319-1330. Available from http://proceedings.mlr.press/v108/hao20a.html.