Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs

Jianhui Chen, Tianbao Yang, Shenghuo Zhu
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:122-130, 2014.

Abstract

We propose a low-rank stochastic gradient descent (LR-SGD) method for solving a class of semidefinite programming (SDP) problems. LR-SGD has a clear computational advantage over standard SGD methods because its iterative projection step (itself an SDP problem) can be solved efficiently. Specifically, LR-SGD constructs a low-rank stochastic gradient and computes an optimal solution to the projection step by exploiting the low-rank structure of that gradient. Moreover, our theoretical analysis shows that stochastic gradients of arbitrarily low rank always exist, which validates the rationale behind LR-SGD. Since LR-SGD is an SGD-based method, it achieves the optimal convergence rates of standard SGD methods. Experimental results demonstrate the efficiency and effectiveness of LR-SGD.
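To make the general template concrete, the sketch below implements projected SGD over the spectrahedron {X : X ⪰ 0, tr(X) ≤ τ} with rank-one stochastic gradients, in Python/NumPy. This is an illustration of the generic scheme only, not the authors' algorithm: the names (project_spectrahedron, lr_sgd, grad_oracle) and the toy objective are hypothetical, and the projection here pays for a full O(n^3) eigendecomposition at every step, which is precisely the cost the paper avoids by exploiting the low-rank structure of its stochastic gradient.

    import numpy as np

    def project_spectrahedron(X, tau):
        """Euclidean projection of a symmetric matrix onto {X : X >= 0, tr(X) <= tau}.

        Reduces to projecting the eigenvalues onto {w : w >= 0, sum(w) <= tau}.
        Uses a full eigendecomposition for clarity; the paper's point is that a
        rank-r gradient update admits a much cheaper projection.
        """
        w, V = np.linalg.eigh((X + X.T) / 2.0)
        w = np.maximum(w, 0.0)
        if w.sum() > tau:
            # Project onto the simplex {w >= 0, sum(w) = tau} (sort-based method).
            u = np.sort(w)[::-1]
            css = np.cumsum(u) - tau
            k = np.nonzero(u - css / np.arange(1, len(u) + 1) > 0)[0][-1]
            w = np.maximum(w - css[k] / (k + 1), 0.0)
        return (V * w) @ V.T

    def lr_sgd(n, grad_oracle, tau, steps=500, eta=0.1):
        """Projected SGD with rank-one stochastic gradients g * v v^T."""
        X = np.zeros((n, n))
        avg = np.zeros((n, n))
        for t in range(1, steps + 1):
            g, v = grad_oracle(X)   # rank-one stochastic gradient: G_t = g * v v^T
            X = project_spectrahedron(X - (eta / np.sqrt(t)) * g * np.outer(v, v), tau)
            avg += (X - avg) / t    # averaged iterate, as in standard SGD analyses
        return avg

    # Hypothetical toy problem: min <C, X> s.t. X >= 0, tr(X) <= 1,
    # where C = sum_i c[i] * V[i] V[i]^T; sampling one term uniformly
    # gives an unbiased rank-one gradient estimate m * c[i] * V[i] V[i]^T.
    rng = np.random.default_rng(0)
    n, m = 20, 50
    V = rng.standard_normal((m, n))
    c = rng.standard_normal(m)

    def oracle(X):
        i = rng.integers(m)
        return m * c[i], V[i]

    X_hat = lr_sgd(n, oracle, tau=1.0)

The toy linear objective, with C given as a sum of rank-one terms, is chosen because sampling a single term yields an unbiased rank-one gradient estimate, matching the setting in which a low-rank update to the projection step is natural.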

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-chen14b,
  title     = {{Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs}},
  author    = {Chen, Jianhui and Yang, Tianbao and Zhu, Shenghuo},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {122--130},
  year      = {2014},
  editor    = {Kaski, Samuel and Corander, Jukka},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/chen14b.pdf},
  url       = {https://proceedings.mlr.press/v33/chen14b.html}
}
APA
Chen, J., Yang, T. & Zhu, S. (2014). Efficient Low-Rank Stochastic Gradient Descent Methods for Solving Semidefinite Programs. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 33:122-130. Available from https://proceedings.mlr.press/v33/chen14b.html.
