Provable Alternating Gradient Descent for Non-negative Matrix Factorization with Strong Correlations

Yuanzhi Li, Yingyu Liang
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:2062-2070, 2017.

Abstract

Non-negative matrix factorization is a basic tool for decomposing data into the feature and weight matrices under non-negativity constraints, and in practice is often solved in the alternating minimization framework. However, it is unclear whether such algorithms can recover the ground-truth feature matrix when the weights for different features are highly correlated, which is common in applications. This paper proposes a simple and natural alternating gradient descent based algorithm, and shows that with a mild initialization it provably recovers the ground-truth in the presence of strong correlations. In most interesting cases, the correlation can be in the same order as the highest possible. Our analysis also reveals several favorable features of the algorithm, including robustness to noise. We complement our theoretical results with empirical studies on semi-synthetic datasets, demonstrating its advantage over several popular methods in recovering the ground-truth.
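To make the setting concrete, the sketch below shows a generic alternating projected gradient descent for NMF: factor a non-negative matrix X as A @ W by taking a gradient step on one factor with the other fixed, then projecting back onto the non-negative orthant. This is a minimal illustration of the general framework only; the function name, step size, and update schedule are assumptions, not the paper's specific algorithm or initialization.

```python
import numpy as np

def nmf_alt_gd(X, r, steps=2000, lr=1e-3, seed=0):
    """Illustrative alternating projected gradient descent for X ~ A @ W
    with A >= 0, W >= 0. Not the paper's exact algorithm; a generic sketch."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    A = rng.random((n, r))
    W = rng.random((r, m))
    for _ in range(steps):
        # Gradient step on A for the squared loss ||A W - X||_F^2
        # with W fixed, then project onto the non-negative orthant.
        R = A @ W - X
        A = np.maximum(A - lr * (R @ W.T), 0.0)
        # Same for W with A fixed.
        R = A @ W - X
        W = np.maximum(W - lr * (A.T @ R), 0.0)
    return A, W

# Usage: recover a low-rank non-negative factorization of synthetic data.
rng = np.random.default_rng(1)
X = rng.random((20, 3)) @ rng.random((3, 30))
A, W = nmf_alt_gd(X, r=3)
rel_err = np.linalg.norm(X - A @ W) / np.linalg.norm(X)
```

The paper's contribution is the provable recovery guarantee under strong weight correlations, which this plain sketch does not capture.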

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-li17b,
  title     = {Provable Alternating Gradient Descent for Non-negative Matrix Factorization with Strong Correlations},
  author    = {Yuanzhi Li and Yingyu Liang},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {2062--2070},
  year      = {2017},
  editor    = {Doina Precup and Yee Whye Teh},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  address   = {International Convention Centre, Sydney, Australia},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/li17b/li17b.pdf},
  url       = {http://proceedings.mlr.press/v70/li17b.html},
  abstract  = {Non-negative matrix factorization is a basic tool for decomposing data into the feature and weight matrices under non-negativity constraints, and in practice is often solved in the alternating minimization framework. However, it is unclear whether such algorithms can recover the ground-truth feature matrix when the weights for different features are highly correlated, which is common in applications. This paper proposes a simple and natural alternating gradient descent based algorithm, and shows that with a mild initialization it provably recovers the ground-truth in the presence of strong correlations. In most interesting cases, the correlation can be in the same order as the highest possible. Our analysis also reveals its several favorable features including robustness to noise. We complement our theoretical results with empirical studies on semi-synthetic datasets, demonstrating its advantage over several popular methods in recovering the ground-truth.}
}
Endnote
%0 Conference Paper
%T Provable Alternating Gradient Descent for Non-negative Matrix Factorization with Strong Correlations
%A Yuanzhi Li
%A Yingyu Liang
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-li17b
%I PMLR
%J Proceedings of Machine Learning Research
%P 2062--2070
%U http://proceedings.mlr.press
%V 70
%W PMLR
%X Non-negative matrix factorization is a basic tool for decomposing data into the feature and weight matrices under non-negativity constraints, and in practice is often solved in the alternating minimization framework. However, it is unclear whether such algorithms can recover the ground-truth feature matrix when the weights for different features are highly correlated, which is common in applications. This paper proposes a simple and natural alternating gradient descent based algorithm, and shows that with a mild initialization it provably recovers the ground-truth in the presence of strong correlations. In most interesting cases, the correlation can be in the same order as the highest possible. Our analysis also reveals its several favorable features including robustness to noise. We complement our theoretical results with empirical studies on semi-synthetic datasets, demonstrating its advantage over several popular methods in recovering the ground-truth.
APA
Li, Y. & Liang, Y. (2017). Provable Alternating Gradient Descent for Non-negative Matrix Factorization with Strong Correlations. Proceedings of the 34th International Conference on Machine Learning, in PMLR 70:2062-2070.