Fast and Sample Efficient Inductive Matrix Completion via Multi-Phase Procrustes Flow

Xiao Zhang, Simon Du, Quanquan Gu
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5756-5765, 2018.

Abstract

We revisit the inductive matrix completion problem that aims to recover a rank-$r$ matrix with ambient dimension $d$ given $n$ features as the side prior information. The goal is to make use of the known $n$ features to reduce sample and computational complexities. We present and analyze a new gradient-based non-convex optimization algorithm that converges to the true underlying matrix at a linear rate with sample complexity only linearly depending on $n$ and logarithmically depending on $d$. To the best of our knowledge, all previous algorithms either have a quadratic dependency on the number of features in sample complexity or a sub-linear computational convergence rate. In addition, we provide experiments on both synthetic and real-world data to demonstrate the effectiveness of our proposed algorithm.
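The setup the abstract describes can be made concrete with a small sketch. Below is an illustrative NumPy example of inductive matrix completion via spectral initialization followed by factored gradient descent — the general template such algorithms follow, not the paper's exact multi-phase Procrustes flow; all dimensions, the sampling rate, the step size, and the orthonormal-feature assumption are arbitrary choices made for the demonstration.

```python
import numpy as np

# Hedged sketch: recover a rank-r matrix M* = X W* X^T from a random
# subset of its entries, exploiting the known feature matrix X.
# Spectral initialization + factored gradient descent (a generic
# template, NOT the paper's multi-phase Procrustes flow).
rng = np.random.default_rng(0)
d, n, r, p = 200, 8, 2, 0.5            # ambient dim, #features, rank, sample rate

X, _ = np.linalg.qr(rng.standard_normal((d, n)))   # side features (orthonormal cols)
W_star = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))  # rank-r core
M_star = X @ W_star @ X.T                           # ground-truth d x d matrix

mask = rng.random((d, d)) < p                       # observed entries Omega

# Spectral initialization: X^T (P_Omega(M*)/p) X estimates W* in expectation.
W0 = X.T @ np.where(mask, M_star, 0.0) @ X / p
U_svd, s, Vt = np.linalg.svd(W0)
U = U_svd[:, :r] * np.sqrt(s[:r])                   # balanced factorization
V = Vt[:r].T * np.sqrt(s[:r])                       # of the rank-r truncation

# Gradient descent on the squared loss over observed entries,
# parameterized through the n x r factors U, V (so each step costs
# O(d^2 n) rather than operating on the full d x d matrix).
eta = 0.02
for _ in range(300):
    R = np.where(mask, X @ U @ V.T @ X.T - M_star, 0.0) / p  # residual on Omega
    U, V = U - eta * (X.T @ R @ X) @ V, V - eta * (X.T @ R.T @ X) @ U

# Relative recovery error; with this many observations it is small.
rel_err = np.linalg.norm(X @ U @ V.T @ X.T - M_star) / np.linalg.norm(M_star)
```

Note the role of the features: the optimization variables are the two $n \times r$ factors rather than a $d \times r$ factor, which is why sample complexity can scale with $n$ instead of $d$.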

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-zhang18b,
  title = {Fast and Sample Efficient Inductive Matrix Completion via Multi-Phase Procrustes Flow},
  author = {Zhang, Xiao and Du, Simon and Gu, Quanquan},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages = {5756--5765},
  year = {2018},
  editor = {Jennifer Dy and Andreas Krause},
  volume = {80},
  series = {Proceedings of Machine Learning Research},
  address = {Stockholmsmässan, Stockholm, Sweden},
  month = {10--15 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v80/zhang18b/zhang18b.pdf},
  url = {http://proceedings.mlr.press/v80/zhang18b.html},
  abstract = {We revisit the inductive matrix completion problem that aims to recover a rank-$r$ matrix with ambient dimension $d$ given $n$ features as the side prior information. The goal is to make use of the known $n$ features to reduce sample and computational complexities. We present and analyze a new gradient-based non-convex optimization algorithm that converges to the true underlying matrix at a linear rate with sample complexity only linearly depending on $n$ and logarithmically depending on $d$. To the best of our knowledge, all previous algorithms either have a quadratic dependency on the number of features in sample complexity or a sub-linear computational convergence rate. In addition, we provide experiments on both synthetic and real-world data to demonstrate the effectiveness of our proposed algorithm.}
}
Endnote
%0 Conference Paper
%T Fast and Sample Efficient Inductive Matrix Completion via Multi-Phase Procrustes Flow
%A Xiao Zhang
%A Simon Du
%A Quanquan Gu
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-zhang18b
%I PMLR
%J Proceedings of Machine Learning Research
%P 5756--5765
%U http://proceedings.mlr.press/v80/zhang18b.html
%V 80
%W PMLR
%X We revisit the inductive matrix completion problem that aims to recover a rank-$r$ matrix with ambient dimension $d$ given $n$ features as the side prior information. The goal is to make use of the known $n$ features to reduce sample and computational complexities. We present and analyze a new gradient-based non-convex optimization algorithm that converges to the true underlying matrix at a linear rate with sample complexity only linearly depending on $n$ and logarithmically depending on $d$. To the best of our knowledge, all previous algorithms either have a quadratic dependency on the number of features in sample complexity or a sub-linear computational convergence rate. In addition, we provide experiments on both synthetic and real-world data to demonstrate the effectiveness of our proposed algorithm.
APA
Zhang, X., Du, S. & Gu, Q. (2018). Fast and Sample Efficient Inductive Matrix Completion via Multi-Phase Procrustes Flow. Proceedings of the 35th International Conference on Machine Learning, in PMLR 80:5756-5765.
