Inductive Matrix Completion: No Bad Local Minima and a Fast Algorithm

Pini Zilber, Boaz Nadler
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:27671-27692, 2022.

Abstract

The inductive matrix completion (IMC) problem is to recover a low rank matrix from few observed entries while incorporating prior knowledge about its row and column subspaces. In this work, we make three contributions to the IMC problem: (i) we prove that under suitable conditions, the IMC optimization landscape has no bad local minima; (ii) we derive a simple scheme with theoretical guarantees to estimate the rank of the unknown matrix; and (iii) we propose GNIMC, a simple Gauss-Newton based method to solve the IMC problem, analyze its runtime and derive for it strong recovery guarantees. The guarantees for GNIMC are sharper in several aspects than those available for other methods, including a quadratic convergence rate, fewer required observed entries and stability to errors or deviations from low-rank. Empirically, given entries observed uniformly at random, GNIMC recovers the underlying matrix substantially faster than several competing methods.

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-zilber22a,
  title     = {Inductive Matrix Completion: No Bad Local Minima and a Fast Algorithm},
  author    = {Zilber, Pini and Nadler, Boaz},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {27671--27692},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/zilber22a/zilber22a.pdf},
  url       = {https://proceedings.mlr.press/v162/zilber22a.html},
  abstract  = {The inductive matrix completion (IMC) problem is to recover a low rank matrix from few observed entries while incorporating prior knowledge about its row and column subspaces. In this work, we make three contributions to the IMC problem: (i) we prove that under suitable conditions, the IMC optimization landscape has no bad local minima; (ii) we derive a simple scheme with theoretical guarantees to estimate the rank of the unknown matrix; and (iii) we propose GNIMC, a simple Gauss-Newton based method to solve the IMC problem, analyze its runtime and derive for it strong recovery guarantees. The guarantees for GNIMC are sharper in several aspects than those available for other methods, including a quadratic convergence rate, fewer required observed entries and stability to errors or deviations from low-rank. Empirically, given entries observed uniformly at random, GNIMC recovers the underlying matrix substantially faster than several competing methods.}
}
Endnote
%0 Conference Paper
%T Inductive Matrix Completion: No Bad Local Minima and a Fast Algorithm
%A Pini Zilber
%A Boaz Nadler
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-zilber22a
%I PMLR
%P 27671--27692
%U https://proceedings.mlr.press/v162/zilber22a.html
%V 162
%X The inductive matrix completion (IMC) problem is to recover a low rank matrix from few observed entries while incorporating prior knowledge about its row and column subspaces. In this work, we make three contributions to the IMC problem: (i) we prove that under suitable conditions, the IMC optimization landscape has no bad local minima; (ii) we derive a simple scheme with theoretical guarantees to estimate the rank of the unknown matrix; and (iii) we propose GNIMC, a simple Gauss-Newton based method to solve the IMC problem, analyze its runtime and derive for it strong recovery guarantees. The guarantees for GNIMC are sharper in several aspects than those available for other methods, including a quadratic convergence rate, fewer required observed entries and stability to errors or deviations from low-rank. Empirically, given entries observed uniformly at random, GNIMC recovers the underlying matrix substantially faster than several competing methods.
APA
Zilber, P. & Nadler, B. (2022). Inductive Matrix Completion: No Bad Local Minima and a Fast Algorithm. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:27671-27692. Available from https://proceedings.mlr.press/v162/zilber22a.html.