Low-Rank Tensor Transitions (LoRT) for Transferable Tensor Regression

Andong Wang, Yuning Qiu, Zhong Jin, Guoxu Zhou, Qibin Zhao
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:64152-64201, 2025.

Abstract

Tensor regression is a powerful tool for analyzing complex multi-dimensional data in fields such as neuroimaging and spatiotemporal analysis, but its effectiveness is often hindered by insufficient sample sizes. To overcome this limitation, we adopt a transfer learning strategy that leverages knowledge from related source tasks to improve performance in data-scarce target tasks. This approach, however, introduces additional challenges including model shifts, covariate shifts, and decentralized data management. We propose the Low-Rank Tensor Transitions (LoRT) framework, which incorporates a novel fusion regularizer and a two-step refinement to enable robust adaptation while preserving low-tubal-rank structure. To support decentralized scenarios, we extend LoRT to D-LoRT, a distributed variant that maintains statistical efficiency with minimal communication overhead. Theoretical analysis and experiments on tensor regression tasks, including compressed sensing and completion, validate the robustness and versatility of the proposed methods. These findings indicate the potential of LoRT as a robust method for tensor regression in settings with limited data and complex distributional structures.
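The abstract's "low-tubal-rank structure" refers to the rank notion from the tensor t-SVD: a 3-way tensor is transformed by an FFT along its third mode, and the tubal rank is the largest matrix rank among the resulting frontal slices. The sketch below is our own minimal illustration of that definition (the function name `tubal_rank` is ours, not from the paper), not an implementation of LoRT itself:

```python
import numpy as np

def tubal_rank(tensor, tol=1e-10):
    """Tubal rank of a 3-way tensor via the t-SVD definition."""
    # FFT along the third mode yields the frontal slices in the Fourier domain
    xf = np.fft.fft(tensor, axis=2)
    # Tubal rank = max matrix rank over the Fourier-domain frontal slices
    return max(np.linalg.matrix_rank(xf[:, :, k], tol=tol)
               for k in range(tensor.shape[2]))

# A tensor whose every frontal slice is the same rank-1 matrix has tubal rank 1
a = np.random.randn(4, 1)
b = np.random.randn(1, 5)
x = np.repeat((a @ b)[:, :, None], repeats=3, axis=2)
print(tubal_rank(x))  # 1
```

A low-tubal-rank constraint of this kind is what LoRT's fusion regularizer and two-step refinement are designed to preserve while transferring knowledge across tasks.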

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-wang25cj,
  title     = {Low-Rank Tensor Transitions ({L}o{RT}) for Transferable Tensor Regression},
  author    = {Wang, Andong and Qiu, Yuning and Jin, Zhong and Zhou, Guoxu and Zhao, Qibin},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {64152--64201},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/wang25cj/wang25cj.pdf},
  url       = {https://proceedings.mlr.press/v267/wang25cj.html},
  abstract  = {Tensor regression is a powerful tool for analyzing complex multi-dimensional data in fields such as neuroimaging and spatiotemporal analysis, but its effectiveness is often hindered by insufficient sample sizes. To overcome this limitation, we adopt a transfer learning strategy that leverages knowledge from related source tasks to improve performance in data-scarce target tasks. This approach, however, introduces additional challenges including model shifts, covariate shifts, and decentralized data management. We propose the Low-Rank Tensor Transitions (LoRT) framework, which incorporates a novel fusion regularizer and a two-step refinement to enable robust adaptation while preserving low-tubal-rank structure. To support decentralized scenarios, we extend LoRT to D-LoRT, a distributed variant that maintains statistical efficiency with minimal communication overhead. Theoretical analysis and experiments on tensor regression tasks, including compressed sensing and completion, validate the robustness and versatility of the proposed methods. These findings indicate the potential of LoRT as a robust method for tensor regression in settings with limited data and complex distributional structures.}
}
Endnote
%0 Conference Paper
%T Low-Rank Tensor Transitions (LoRT) for Transferable Tensor Regression
%A Andong Wang
%A Yuning Qiu
%A Zhong Jin
%A Guoxu Zhou
%A Qibin Zhao
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-wang25cj
%I PMLR
%P 64152--64201
%U https://proceedings.mlr.press/v267/wang25cj.html
%V 267
%X Tensor regression is a powerful tool for analyzing complex multi-dimensional data in fields such as neuroimaging and spatiotemporal analysis, but its effectiveness is often hindered by insufficient sample sizes. To overcome this limitation, we adopt a transfer learning strategy that leverages knowledge from related source tasks to improve performance in data-scarce target tasks. This approach, however, introduces additional challenges including model shifts, covariate shifts, and decentralized data management. We propose the Low-Rank Tensor Transitions (LoRT) framework, which incorporates a novel fusion regularizer and a two-step refinement to enable robust adaptation while preserving low-tubal-rank structure. To support decentralized scenarios, we extend LoRT to D-LoRT, a distributed variant that maintains statistical efficiency with minimal communication overhead. Theoretical analysis and experiments on tensor regression tasks, including compressed sensing and completion, validate the robustness and versatility of the proposed methods. These findings indicate the potential of LoRT as a robust method for tensor regression in settings with limited data and complex distributional structures.
APA
Wang, A., Qiu, Y., Jin, Z., Zhou, G. &amp; Zhao, Q. (2025). Low-Rank Tensor Transitions (LoRT) for Transferable Tensor Regression. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:64152-64201. Available from https://proceedings.mlr.press/v267/wang25cj.html.
