Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations

Quanming Yao, James Tin-Yau Kwok, Bo Han
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7035-7044, 2019.

Abstract

Nonconvex regularizers have been successfully used in low-rank matrix learning. In this paper, we extend this to the more challenging problem of low-rank tensor completion. Based on the proximal average algorithm, we develop an efficient solver that avoids expensive tensor folding and unfolding. A special “sparse plus low-rank” structure, which is essential for fast computation of individual proximal steps, is maintained throughout the iterations. We also incorporate adaptive momentum to further speed up empirical convergence. Convergence results to critical points are provided under smoothness and Kurdyka-Łojasiewicz conditions. Experimental results on a number of synthetic and real-world data sets show that the proposed algorithm is more efficient in both time and space, and is also more accurate than existing approaches.
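
To make the kind of iteration described above concrete, the sketch below runs a plain proximal-average step for tensor completion with a nonconvex capped-l1 penalty on the singular values of each mode unfolding. This is only an illustrative baseline of the proximal average idea, not the paper's structure-aware solver (which maintains the "sparse plus low-rank" form and avoids folding/unfolding); the function names (capped_l1_prox, prox_average_step), the penalty choice, and the toy data are all assumptions written in plain numpy.

import numpy as np

def capped_l1_prox(sigma, lam, theta):
    # Exact prox of r(s) = lam * min(s, theta) on nonnegative singular values:
    # compare the minimizer on the branch s <= theta with the one on s >= theta.
    cand1 = np.minimum(np.maximum(sigma - lam, 0.0), theta)
    cand2 = np.maximum(sigma, theta)
    obj1 = 0.5 * (cand1 - sigma) ** 2 + lam * cand1
    obj2 = 0.5 * (cand2 - sigma) ** 2 + lam * theta
    return np.where(obj1 <= obj2, cand1, cand2)

def unfold(X, mode):
    # Mode-k matricization of a tensor.
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def fold(M, mode, shape):
    # Inverse of unfold.
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def prox_average_step(X, O_obs, mask, lam, theta, step):
    # One proximal-average iteration: gradient step on the squared loss over the
    # observed entries, then average the per-mode proximal maps (each acting on
    # the singular values of one unfolding) instead of the intractable joint prox.
    Z = X - step * mask * (X - O_obs)
    acc = np.zeros_like(X)
    for k in range(X.ndim):
        U, s, Vt = np.linalg.svd(unfold(Z, k), full_matrices=False)
        acc += fold(U @ np.diag(capped_l1_prox(s, step * lam, theta)) @ Vt, k, X.shape)
    return acc / X.ndim

# Toy usage: complete a 20x20x20 low-rank tensor from 30% of its entries.
rng = np.random.default_rng(0)
O = (rng.standard_normal((20, 5)) @ rng.standard_normal((5, 400))).reshape(20, 20, 20)
mask = (rng.random(O.shape) < 0.3).astype(float)
X = np.zeros_like(O)
for _ in range(200):
    X = prox_average_step(X, mask * O, mask, lam=0.5, theta=2.0, step=1.0)

The averaging in prox_average_step is what makes the per-iteration cost manageable: each component prox only touches one unfolding, whereas the prox of the summed regularizer has no closed form.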

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-yao19a,
  title     = {Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations},
  author    = {Yao, Quanming and Kwok, James Tin-Yau and Han, Bo},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7035--7044},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/yao19a/yao19a.pdf},
  url       = {https://proceedings.mlr.press/v97/yao19a.html},
  abstract  = {Nonconvex regularizers have been successfully used in low-rank matrix learning. In this paper, we extend this to the more challenging problem of low-rank tensor completion. Based on the proximal average algorithm, we develop an efficient solver that avoids expensive tensor folding and unfolding. A special “sparse plus low-rank” structure, which is essential for fast computation of individual proximal steps, is maintained throughout the iterations. We also incorporate adaptive momentum to further speed up empirical convergence. Convergence results to critical points are provided under smoothness and Kurdyka-Łojasiewicz conditions. Experimental results on a number of synthetic and real-world data sets show that the proposed algorithm is more efficient in both time and space, and is also more accurate than existing approaches.}
}
Endnote
%0 Conference Paper
%T Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations
%A Quanming Yao
%A James Tin-Yau Kwok
%A Bo Han
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-yao19a
%I PMLR
%P 7035--7044
%U https://proceedings.mlr.press/v97/yao19a.html
%V 97
%X Nonconvex regularizers have been successfully used in low-rank matrix learning. In this paper, we extend this to the more challenging problem of low-rank tensor completion. Based on the proximal average algorithm, we develop an efficient solver that avoids expensive tensor folding and unfolding. A special “sparse plus low-rank” structure, which is essential for fast computation of individual proximal steps, is maintained throughout the iterations. We also incorporate adaptive momentum to further speed up empirical convergence. Convergence results to critical points are provided under smoothness and Kurdyka-Łojasiewicz conditions. Experimental results on a number of synthetic and real-world data sets show that the proposed algorithm is more efficient in both time and space, and is also more accurate than existing approaches.
APA
Yao, Q., Kwok, J.T. & Han, B. (2019). Efficient Nonconvex Regularized Tensor Completion with Structure-aware Proximal Iterations. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7035-7044. Available from https://proceedings.mlr.press/v97/yao19a.html.
