Fast Tensor Completion via Approximate Richardson Iteration

Mehrdad Ghadiri, Matthew Fahrbach, Yunbum Kook, Ali Jadbabaie
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:19248-19265, 2025.

Abstract

We study tensor completion (TC) through the lens of low-rank tensor decomposition (TD). Many TD algorithms use fast alternating minimization methods to solve highly structured linear regression problems at each step (e.g., for CP, Tucker, and tensor-train decompositions). However, such algebraic structure is often lost in TC regression problems, making direct extensions unclear. This work proposes a novel lifting method for approximately solving TC regression problems using structured TD regression algorithms as blackbox subroutines, enabling sublinear-time methods. We analyze the convergence rate of our approximate Richardson iteration-based algorithm, and our empirical study shows that it can be 100x faster than direct methods for CP completion on real-world tensors.
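For reference, the sketch below shows classical Richardson iteration for a least-squares problem min_x ||Ax - b||_2, the iterative scheme the paper's algorithm builds on. It is a generic illustration only: the paper's method instead lifts the TC regression problem and replaces the exact residual step with approximate solves by structured TD regression subroutines, and the function name richardson_lsq and all parameters here are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def richardson_lsq(A, b, omega=None, num_iters=1000, tol=1e-10):
        """Classical Richardson iteration for min_x ||Ax - b||_2.

        Illustrative sketch only -- not the paper's lifted TC algorithm.
        Update rule: x <- x + omega * A^T (b - A x), which converges for
        step sizes 0 < omega < 2 / lambda_max(A^T A).
        """
        m, n = A.shape
        if omega is None:
            # Safe step size from the largest singular value of A.
            omega = 1.0 / np.linalg.norm(A, 2) ** 2
        x = np.zeros(n)
        for _ in range(num_iters):
            residual = b - A @ x
            grad = A.T @ residual
            if np.linalg.norm(grad) < tol:
                break
            x = x + omega * grad
        return x

    # Usage: a small random least-squares instance.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)
    x = richardson_lsq(A, b, num_iters=5000)
    print(np.linalg.norm(A.T @ (b - A @ x)))  # near-zero gradient norm

In the paper's setting, the exact update A^T(b - Ax) is replaced by an approximate one computed via a fast structured solver on the lifted problem, which is what enables the sublinear-time behavior the abstract describes.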

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-ghadiri25a,
  title     = {Fast Tensor Completion via Approximate Richardson Iteration},
  author    = {Ghadiri, Mehrdad and Fahrbach, Matthew and Kook, Yunbum and Jadbabaie, Ali},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {19248--19265},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/ghadiri25a/ghadiri25a.pdf},
  url       = {https://proceedings.mlr.press/v267/ghadiri25a.html}
}
Endnote
%0 Conference Paper
%T Fast Tensor Completion via Approximate Richardson Iteration
%A Mehrdad Ghadiri
%A Matthew Fahrbach
%A Yunbum Kook
%A Ali Jadbabaie
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-ghadiri25a
%I PMLR
%P 19248--19265
%U https://proceedings.mlr.press/v267/ghadiri25a.html
%V 267
APA
Ghadiri, M., Fahrbach, M., Kook, Y. & Jadbabaie, A. (2025). Fast Tensor Completion via Approximate Richardson Iteration. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:19248-19265. Available from https://proceedings.mlr.press/v267/ghadiri25a.html.