Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs

Jianhui Chen, Tianbao Yang, Shenghuo Zhu;
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:122-130, 2014.

Abstract

We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has clear computational advantages over standard SGD methods because its iterative projection step (itself an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the projection step by exploiting the low-rank structure of that gradient. Moreover, our theoretical analysis shows the universal existence of stochastic gradients of arbitrarily low rank, which in turn validates the rationale of the LR-SGD method. Since LR-SGD is an SGD-based method, it achieves the optimal convergence rates of standard SGD methods. Experimental results demonstrate the efficiency and effectiveness of the LR-SGD method.
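To make the setting concrete, below is a minimal, hypothetical Python/NumPy sketch of projected SGD over the spectrahedron {X : X PSD, trace(X) = 1}, a common feasible set for this class of SDPs. The rank-1 gradient oracle and the full-eigendecomposition projection are illustrative assumptions, not the paper's algorithm: LR-SGD's contribution is precisely to solve the projection step without a full eigendecomposition by exploiting the low-rank structure of the stochastic gradient.

```python
import numpy as np

def project_spectrahedron(X):
    """Project onto {X : X PSD, trace(X) = 1} in Frobenius norm by
    projecting the eigenvalues onto the probability simplex.
    (Illustrative full eigendecomposition; LR-SGD avoids this cost.)"""
    w, V = np.linalg.eigh((X + X.T) / 2)       # symmetrize, then diagonalize
    u = np.sort(w)[::-1]                       # eigenvalues, descending
    cssv = np.cumsum(u) - 1.0
    rho = np.nonzero(u > cssv / np.arange(1, len(u) + 1))[0][-1]
    theta = cssv[rho] / (rho + 1.0)
    w_proj = np.maximum(w - theta, 0.0)        # simplex-projected spectrum
    return (V * w_proj) @ V.T

def low_rank_sgd(grad_oracle, n, steps=200, eta0=1.0):
    """Projected SGD with low-rank stochastic gradients (sketch)."""
    X = np.eye(n) / n                          # feasible starting point
    for t in range(1, steps + 1):
        G = grad_oracle(X)                     # e.g. a rank-1 matrix
        X = project_spectrahedron(X - (eta0 / np.sqrt(t)) * G)
    return X

# Toy usage: minimize <C, X> over the spectrahedron. A random column of C
# yields a rank-1 unbiased estimator of the (constant) gradient C.
rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
C = (A + A.T) / 2

def rank_one_grad(X):
    j = rng.integers(n)
    e_j = np.zeros(n)
    e_j[j] = 1.0
    return n * np.outer(C[:, j], e_j)          # E[G] = C, rank(G) = 1

X_star = low_rank_sgd(rank_one_grad, n)
```

In LR-SGD proper, the rank-r structure of the stochastic gradient (with r much smaller than n) lets the projected iterate be updated without re-diagonalizing the full matrix at every step; the full eigh call above is used only to keep the sketch short.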
