Noisy Low-rank Matrix Optimization: Geometry of Local Minima and Convergence Rate

Ziye Ma, Somayeh Sojoudi
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:3125-3150, 2023.

Abstract

This paper is concerned with low-rank matrix optimization, which has found a wide range of applications in machine learning. The special case of matrix sensing has been studied extensively through the notion of the Restricted Isometry Property (RIP), leading to a wealth of results on the geometric landscape of the problem and the convergence rate of common algorithms. However, the existing results can handle a general objective function with noisy data only when the RIP constant is close to 0. In this paper, we develop a new mathematical framework to solve the above-mentioned problem with a far less restrictive RIP constant. We prove that as long as the RIP constant of the noiseless objective is less than 1/3, any spurious local solution of the noisy optimization problem must be close to the ground truth solution. Using the strict saddle property, we also show that an approximate solution can be found in polynomial time. We further characterize the geometry of the spurious local minima of the problem in a local region around the ground truth when the RIP constant is greater than 1/3. Compared to the existing results in the literature, this paper offers the strongest RIP bound and provides a complete theoretical analysis of the global and local optimization landscapes of general low-rank optimization problems under random corruptions from any finite-variance family.
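For context, the following is a minimal sketch of the standard noisy matrix sensing formulation and the RIP condition referenced above; the paper's exact setup (in particular the rank at which the RIP constant is measured and the generality of the objective function) may differ.

\[
\min_{X \in \mathbb{R}^{n \times r}} \ \tfrac{1}{2}\big\|\mathcal{A}(XX^\top) - b\big\|_2^2, \qquad b = \mathcal{A}(M^*) + w,
\]

where $M^*$ is the rank-$r$ ground truth matrix, $w$ is random measurement noise with finite variance, and the linear measurement operator $\mathcal{A}$ satisfies the RIP with constant $\delta \in [0, 1)$ if

\[
(1 - \delta)\,\|M\|_F^2 \ \le\ \|\mathcal{A}(M)\|_2^2 \ \le\ (1 + \delta)\,\|M\|_F^2 \quad \text{for all } M \text{ with } \operatorname{rank}(M) \le 2r.
\]

In this notation, the main guarantee above reads: if the noiseless objective satisfies the RIP with $\delta < 1/3$, then every spurious local minimizer $X$ of the noisy factorized problem yields a matrix $XX^\top$ close to $M^*$, with the distance controlled by the noise level.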

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-ma23a,
  title     = {Noisy Low-rank Matrix Optimization: Geometry of Local Minima and Convergence Rate},
  author    = {Ma, Ziye and Sojoudi, Somayeh},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {3125--3150},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/ma23a/ma23a.pdf},
  url       = {https://proceedings.mlr.press/v206/ma23a.html}
}
Endnote
%0 Conference Paper
%T Noisy Low-rank Matrix Optimization: Geometry of Local Minima and Convergence Rate
%A Ziye Ma
%A Somayeh Sojoudi
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-ma23a
%I PMLR
%P 3125--3150
%U https://proceedings.mlr.press/v206/ma23a.html
%V 206
APA
Ma, Z. & Sojoudi, S. (2023). Noisy Low-rank Matrix Optimization: Geometry of Local Minima and Convergence Rate. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:3125-3150. Available from https://proceedings.mlr.press/v206/ma23a.html.