A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

Christian Kümmerle, Claudio M. Verdun
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:5872-5883, 2021.

Abstract

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate. It combines the favorable data-efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. We establish the first local convergence guarantee from a minimal number of samples for this class of algorithms, showing that the method attains a local quadratic convergence rate. Furthermore, we show that the linear systems to be solved are well-conditioned even for very ill-conditioned ground truth matrices. We provide extensive experiments indicating that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with a condition number of up to $10^{10}$ from few samples, while being competitive in its scalability.

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-kummerle21a,
  title     = {A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples},
  author    = {K{\"u}mmerle, Christian and Verdun, Claudio M.},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {5872--5883},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/kummerle21a/kummerle21a.pdf},
  url       = {https://proceedings.mlr.press/v139/kummerle21a.html},
  abstract  = {We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate. It combines the favorable data-efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. We establish the first local convergence guarantee from a minimal number of samples for this class of algorithms, showing that the method attains a local quadratic convergence rate. Furthermore, we show that the linear systems to be solved are well-conditioned even for very ill-conditioned ground truth matrices. We provide extensive experiments indicating that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with a condition number of up to $10^{10}$ from few samples, while being competitive in its scalability.}
}
Endnote
%0 Conference Paper
%T A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples
%A Christian Kümmerle
%A Claudio M. Verdun
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-kummerle21a
%I PMLR
%P 5872--5883
%U https://proceedings.mlr.press/v139/kummerle21a.html
%V 139
%X We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate. It combines the favorable data-efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. We establish the first local convergence guarantee from a minimal number of samples for this class of algorithms, showing that the method attains a local quadratic convergence rate. Furthermore, we show that the linear systems to be solved are well-conditioned even for very ill-conditioned ground truth matrices. We provide extensive experiments indicating that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with a condition number of up to $10^{10}$ from few samples, while being competitive in its scalability.
APA
Kümmerle, C. & Verdun, C. M. (2021). A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:5872-5883. Available from https://proceedings.mlr.press/v139/kummerle21a.html.