When Can We Solve the Weighted Low Rank Approximation Problem in Truly Subquadratic Time?

Chenyang Li, Yingyu Liang, Zhenmei Shi, Zhao Song
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:2710-2718, 2025.

Abstract

The weighted low-rank approximation problem is a fundamental numerical linear algebra problem and has many applications in machine learning. Given an $n \times n$ weight matrix $W$ and an $n \times n$ matrix $A$, the goal is to find two low-rank matrices $U, V \in \mathbb{R}^{n \times k}$ such that the cost $\|W \circ (U V^\top - A)\|_F^2$ is minimized. Previous work has to pay $\Omega(n^2)$ time when the matrices $A$ and $W$ are dense, e.g., have $\Omega(n^2)$ non-zero entries. In this work, we show that there is a certain regime in which, even if $A$ and $W$ are dense, we can still hope to solve the weighted low-rank approximation problem in almost linear $n^{1+o(1)}$ time.
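For concreteness, the objective is the squared Frobenius norm of the entrywise (Hadamard) product $W \circ (UV^\top - A)$. The snippet below is a minimal NumPy sketch that evaluates this cost and performs one generic alternating weighted least-squares step; it is purely illustrative of the objective and is not the almost-linear-time algorithm proposed in the paper (the variable names and the $\Omega(n^2 k)$ baseline loop are assumptions for illustration).

```python
import numpy as np

def weighted_lowrank_cost(W, A, U, V):
    """Cost ||W o (U V^T - A)||_F^2, with 'o' the entrywise (Hadamard) product."""
    R = W * (U @ V.T - A)          # entrywise weighting of the residual
    return np.sum(R ** 2)

def als_step(W, A, U, k):
    """One generic alternating step: solve for V column-by-column with U fixed.

    For each column j of A, row V[j] solves the weighted least-squares problem
        min_v || diag(W[:, j]) (U v - A[:, j]) ||_2^2 .
    This naive loop costs Omega(n^2 k) time and only illustrates the objective,
    not the subquadratic approach of the paper.
    """
    n = A.shape[0]
    V = np.zeros((n, k))
    for j in range(n):
        w = W[:, j] ** 2                     # squared weights for column j
        G = U.T @ (U * w[:, None])           # k x k weighted Gram matrix
        b = U.T @ (w * A[:, j])              # weighted right-hand side
        V[j] = np.linalg.solve(G + 1e-12 * np.eye(k), b)
    return V

# Tiny usage example with random data (hypothetical sizes).
rng = np.random.default_rng(0)
n, k = 50, 3
A = rng.standard_normal((n, n))
W = rng.uniform(0.5, 1.5, size=(n, n))
U = rng.standard_normal((n, k))
V = als_step(W, A, U, k)
print(weighted_lowrank_cost(W, A, U, V))
```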

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-li25f,
  title     = {When Can We Solve the Weighted Low Rank Approximation Problem in Truly Subquadratic Time?},
  author    = {Li, Chenyang and Liang, Yingyu and Shi, Zhenmei and Song, Zhao},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {2710--2718},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/li25f/li25f.pdf},
  url       = {https://proceedings.mlr.press/v258/li25f.html},
  abstract  = {The weighted low-rank approximation problem is a fundamental numerical linear algebra problem and has many applications in machine learning. Given a $n \times n$ weight matrix $W$ and a $n \times n$ matrix $A$, the goal is to find two low-rank matrices $U, V \in \mathbb{R}^{n \times k}$ such that the cost of $\|{W} \circ (U V^\top - A) \|_F^2$ is minimized. Previous work has to pay $\Omega(n^2)$ time when matrices $A$ and $W$ are dense, e.g., having $\Omega(n^2)$ non-zero entries. In this work, we show that there is a certain regime, even if $A$ and $W$ are dense, we can still hope to solve the weighted low-rank approximation problem in almost linear $n^{1+o(1)}$ time.}
}
Endnote
%0 Conference Paper
%T When Can We Solve the Weighted Low Rank Approximation Problem in Truly Subquadratic Time?
%A Chenyang Li
%A Yingyu Liang
%A Zhenmei Shi
%A Zhao Song
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-li25f
%I PMLR
%P 2710--2718
%U https://proceedings.mlr.press/v258/li25f.html
%V 258
%X The weighted low-rank approximation problem is a fundamental numerical linear algebra problem and has many applications in machine learning. Given a $n \times n$ weight matrix $W$ and a $n \times n$ matrix $A$, the goal is to find two low-rank matrices $U, V \in \mathbb{R}^{n \times k}$ such that the cost of $\|{W} \circ (U V^\top - A) \|_F^2$ is minimized. Previous work has to pay $\Omega(n^2)$ time when matrices $A$ and $W$ are dense, e.g., having $\Omega(n^2)$ non-zero entries. In this work, we show that there is a certain regime, even if $A$ and $W$ are dense, we can still hope to solve the weighted low-rank approximation problem in almost linear $n^{1+o(1)}$ time.
APA
Li, C., Liang, Y., Shi, Z. & Song, Z. (2025). When Can We Solve the Weighted Low Rank Approximation Problem in Truly Subquadratic Time? Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:2710-2718. Available from https://proceedings.mlr.press/v258/li25f.html.
