No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis

Rong Ge, Chi Jin, Yi Zheng
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1233-1242, 2017.

Abstract

In this paper we develop a new framework that captures the landscape common to popular non-convex low-rank matrix problems, including matrix sensing, matrix completion and robust PCA. In particular, we show for all of the above problems (including the asymmetric cases): 1) all local minima are also globally optimal; 2) no high-order saddle points exist. These results explain why simple algorithms such as stochastic gradient descent converge globally, and efficiently optimize these non-convex objective functions in practice. Our framework connects and simplifies the existing analyses of optimization landscapes for matrix sensing and symmetric matrix completion. The framework naturally leads to new results for asymmetric matrix completion and robust PCA.
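As an illustration of the landscape phenomenon the abstract describes, below is a minimal numerical sketch (not from the paper): plain gradient descent on the non-convex symmetric matrix completion objective f(U) = ||P_Omega(U U^T - M*)||_F^2, started from a random initialization, recovers a rank-r ground truth M*. The dimensions, sampling rate, and step size are illustrative assumptions, and the sketch omits the row-norm regularizer used in the paper's completion analysis; it uses deterministic full-gradient descent rather than SGD for reproducibility.

import numpy as np

# Minimal sketch (illustrative assumptions throughout; not the paper's code).
# Gradient descent on f(U) = ||P_Omega(U U^T - M*)||_F^2 for a rank-r target M*.
rng = np.random.default_rng(0)
n, r = 40, 2
U_star = rng.normal(size=(n, r)) / np.sqrt(n)   # ground-truth factor with O(1) spectrum
M_star = U_star @ U_star.T                      # rank-r matrix to complete
mask = rng.random((n, n)) < 0.6                 # ~60% of entries observed
mask = np.triu(mask) | np.triu(mask).T          # symmetric observation pattern

U = 0.5 * rng.normal(size=(n, r)) / np.sqrt(n)  # random initialization
eta = 0.05                                      # illustrative step size
for _ in range(5000):
    R = mask * (U @ U.T - M_star)               # residual on observed entries only
    U -= eta * 2 * (R + R.T) @ U                # exact gradient of f(U)

err = np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star)
print(f"relative recovery error: {err:.2e}")    # small error: a global minimum was found

Because (per the paper's results) every local minimum of such objectives is global and all saddle points are strict, descent methods that escape strict saddles are not trapped at suboptimal stationary points, which is consistent with the near-exact recovery this toy run exhibits.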

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-ge17a,
  title     = {No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis},
  author    = {Rong Ge and Chi Jin and Yi Zheng},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {1233--1242},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/ge17a/ge17a.pdf},
  url       = {https://proceedings.mlr.press/v70/ge17a.html},
  abstract  = {In this paper we develop a new framework that captures the landscape common to popular non-convex low-rank matrix problems, including matrix sensing, matrix completion and robust PCA. In particular, we show for all of the above problems (including the asymmetric cases): 1) all local minima are also globally optimal; 2) no high-order saddle points exist. These results explain why simple algorithms such as stochastic gradient descent converge globally, and efficiently optimize these non-convex objective functions in practice. Our framework connects and simplifies the existing analyses of optimization landscapes for matrix sensing and symmetric matrix completion. The framework naturally leads to new results for asymmetric matrix completion and robust PCA.}
}
Endnote
%0 Conference Paper
%T No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis
%A Rong Ge
%A Chi Jin
%A Yi Zheng
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-ge17a
%I PMLR
%P 1233--1242
%U https://proceedings.mlr.press/v70/ge17a.html
%V 70
%X In this paper we develop a new framework that captures the landscape common to popular non-convex low-rank matrix problems, including matrix sensing, matrix completion and robust PCA. In particular, we show for all of the above problems (including the asymmetric cases): 1) all local minima are also globally optimal; 2) no high-order saddle points exist. These results explain why simple algorithms such as stochastic gradient descent converge globally, and efficiently optimize these non-convex objective functions in practice. Our framework connects and simplifies the existing analyses of optimization landscapes for matrix sensing and symmetric matrix completion. The framework naturally leads to new results for asymmetric matrix completion and robust PCA.
APA
Ge, R., Jin, C. & Zheng, Y. (2017). No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1233-1242. Available from https://proceedings.mlr.press/v70/ge17a.html.