Preconditioned Riemannian Gradient Descent Algorithm for Low-Multilinear-Rank Tensor Completion

Yuanwei Zhang, Fengmiao Bian, Xiaoqun Zhang, Jian-Feng Cai
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:74490-74514, 2025.

Abstract

Tensors play a crucial role in numerous scientific and engineering fields. This paper addresses the low-multilinear-rank tensor completion problem, a fundamental task in tensor-related applications. By exploiting the manifold structure inherent to the fixed-multilinear-rank tensor set, we introduce a simple yet highly effective preconditioned Riemannian metric and propose the Preconditioned Riemannian Gradient Descent (PRGD) algorithm. Compared to standard Riemannian Gradient Descent (RGD), PRGD achieves faster convergence while maintaining the same order of per-iteration computational complexity. Theoretically, we provide a recovery guarantee for PRGD under near-optimal sampling complexity. Numerical results highlight the efficiency of PRGD, which outperforms state-of-the-art methods on both synthetic data and real-world video inpainting tasks.
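
For readers who want a concrete starting point, below is a minimal NumPy sketch of the unpreconditioned RGD / iterative-hard-thresholding baseline that PRGD is designed to accelerate, for a 3-way tensor. The HOSVD-based retraction, spectral-style initialization, and 1/p step size are common choices assumed here for illustration; the preconditioned metric that distinguishes PRGD is defined in the paper itself and is not reproduced in this sketch.

# A minimal, self-contained sketch of the RGD baseline for 3-way
# low-multilinear-rank tensor completion. The HOSVD retraction, spectral
# initialization, and 1/p step size are illustrative assumptions; the
# preconditioned metric that defines PRGD is specified in the paper and
# is NOT reproduced here.
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    # Inverse of `unfold` for a target tensor of the given shape.
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def hosvd_retract(T, ranks):
    # Sequentially truncated HOSVD: project each unfolding onto its top-r
    # left singular subspace. A standard quasi-optimal retraction onto the
    # set of tensors with multilinear rank at most `ranks`.
    X = T
    for mode, r in enumerate(ranks):
        Xm = unfold(X, mode)
        U, _, _ = np.linalg.svd(Xm, full_matrices=False)
        U = U[:, :r]
        X = fold(U @ (U.T @ Xm), mode, X.shape)
    return X

def rgd_complete(Y, mask, ranks, p, n_iter=300):
    # Y: observed tensor (zeros off the sampling set); mask: boolean tensor;
    # p: sampling rate. Each iteration takes a gradient step on the loss
    # 0.5 * ||P_Omega(X - Y)||_F^2, then retracts. PRGD would additionally
    # rescale the gradient by the preconditioned metric before retracting.
    X = hosvd_retract(Y / p, ranks)            # spectral-style initialization
    for _ in range(n_iter):
        G = mask * (X - Y)                     # Euclidean gradient
        X = hosvd_retract(X - G / p, ranks)    # step + retraction
    return X

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, r, p = 30, 3, 0.3
    # Random multilinear-rank-(r, r, r) ground truth built in Tucker form.
    T = rng.standard_normal((r, r, r))
    for mode in range(3):
        U = np.linalg.qr(rng.standard_normal((n, r)))[0]
        T = fold(U @ unfold(T, mode), mode,
                 T.shape[:mode] + (n,) + T.shape[mode + 1:])
    mask = rng.random(T.shape) < p             # Bernoulli(p) sampling
    X = rgd_complete(mask * T, mask, (r, r, r), p)
    print("relative error:", np.linalg.norm(X - T) / np.linalg.norm(T))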

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-zhang25c,
  title     = {Preconditioned {R}iemannian Gradient Descent Algorithm for Low-Multilinear-Rank Tensor Completion},
  author    = {Zhang, Yuanwei and Bian, Fengmiao and Zhang, Xiaoqun and Cai, Jian-Feng},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {74490--74514},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/zhang25c/zhang25c.pdf},
  url       = {https://proceedings.mlr.press/v267/zhang25c.html},
  abstract  = {Tensors play a crucial role in numerous scientific and engineering fields. This paper addresses the low-multilinear-rank tensor completion problem, a fundamental task in tensor-related applications. By exploiting the manifold structure inherent to the fixed-multilinear-rank tensor set, we introduce a simple yet highly effective preconditioned Riemannian metric and propose the Preconditioned Riemannian Gradient Descent (PRGD) algorithm. Compared to standard Riemannian Gradient Descent (RGD), PRGD achieves faster convergence while maintaining the same order of per-iteration computational complexity. Theoretically, we provide a recovery guarantee for PRGD under near-optimal sampling complexity. Numerical results highlight the efficiency of PRGD, which outperforms state-of-the-art methods on both synthetic data and real-world video inpainting tasks.}
}
Endnote
%0 Conference Paper
%T Preconditioned Riemannian Gradient Descent Algorithm for Low-Multilinear-Rank Tensor Completion
%A Yuanwei Zhang
%A Fengmiao Bian
%A Xiaoqun Zhang
%A Jian-Feng Cai
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-zhang25c
%I PMLR
%P 74490--74514
%U https://proceedings.mlr.press/v267/zhang25c.html
%V 267
%X Tensors play a crucial role in numerous scientific and engineering fields. This paper addresses the low-multilinear-rank tensor completion problem, a fundamental task in tensor-related applications. By exploiting the manifold structure inherent to the fixed-multilinear-rank tensor set, we introduce a simple yet highly effective preconditioned Riemannian metric and propose the Preconditioned Riemannian Gradient Descent (PRGD) algorithm. Compared to standard Riemannian Gradient Descent (RGD), PRGD achieves faster convergence while maintaining the same order of per-iteration computational complexity. Theoretically, we provide a recovery guarantee for PRGD under near-optimal sampling complexity. Numerical results highlight the efficiency of PRGD, which outperforms state-of-the-art methods on both synthetic data and real-world video inpainting tasks.
APA
Zhang, Y., Bian, F., Zhang, X. & Cai, J. (2025). Preconditioned Riemannian Gradient Descent Algorithm for Low-Multilinear-Rank Tensor Completion. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:74490-74514. Available from https://proceedings.mlr.press/v267/zhang25c.html.