Accelerated Stochastic Gradient Descent for Minimizing Finite Sums

Atsushi Nitanda
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:195-203, 2016.

Abstract

We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduced gradient (SVRG) in a mini-batch setting. An important feature of the method is that it can be directly applied to general convex and semi-strongly convex problems, the latter being a weaker condition than strong convexity. We show that our method achieves a better overall complexity for general convex problems and linear convergence for optimal strongly convex problems. Moreover, we prove a fast iteration complexity for our method. Our experiments demonstrate the effectiveness of the method.
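
To make the description above concrete, the following is a minimal, illustrative sketch (in Python) of the kind of update the abstract points to: an SVRG-style variance-reduced gradient estimated on a mini-batch, combined with a Nesterov-type extrapolation step. It is not the paper's actual algorithm or parameter schedule; all names and parameters here (grad_i, n, b, eta, beta, num_stages, inner_iters) are assumptions made only for this sketch.

# Illustrative sketch only -- NOT the paper's exact method. Minimizes
# F(x) = (1/n) * sum_i f_i(x) with a mini-batch SVRG-type gradient
# estimator plus a fixed Nesterov-style momentum term (a simplification;
# accelerated methods normally use an iteration-dependent schedule).
import numpy as np

def accelerated_svrg_sketch(grad_i, x0, n, b, eta, beta,
                            num_stages, inner_iters, rng=None):
    """grad_i(i, x) returns the gradient of the i-th component f_i at x."""
    rng = np.random.default_rng() if rng is None else rng
    x_tilde = np.asarray(x0, dtype=float).copy()
    for _ in range(num_stages):
        # Full gradient at the snapshot (the SVRG anchor point).
        full_grad = np.mean([grad_i(i, x_tilde) for i in range(n)], axis=0)
        x = x_tilde.copy()       # current iterate
        y = x_tilde.copy()       # look-ahead (momentum) point
        for _ in range(inner_iters):
            # Variance-reduced gradient estimate on a random mini-batch.
            batch = rng.choice(n, size=b, replace=False)
            vr_grad = full_grad + np.mean(
                [grad_i(i, y) - grad_i(i, x_tilde) for i in batch], axis=0)
            # Gradient step from the look-ahead point, then extrapolate.
            x_prev, x = x, y - eta * vr_grad
            y = x + beta * (x - x_prev)
        x_tilde = x              # new snapshot for the next stage
    return x_tilde

The point the sketch illustrates is that the mini-batch estimator vr_grad is unbiased for the full gradient at the look-ahead point y and its variance shrinks near the snapshot x_tilde, which is what allows the acceleration step to be combined with cheap stochastic gradients. As a note on terminology, "semi-strong convexity" in this line of work is commonly stated as a quadratic-growth condition, F(x) - F* >= (mu/2) * dist(x, X*)^2 for all x, where X* is the set of minimizers; it is implied by mu-strong convexity but not conversely, which is why the abstract calls it the weaker assumption.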

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-nitanda16,
  title     = {Accelerated Stochastic Gradient Descent for Minimizing Finite Sums},
  author    = {Nitanda, Atsushi},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {195--203},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/nitanda16.pdf},
  url       = {https://proceedings.mlr.press/v51/nitanda16.html},
  abstract  = {We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduced gradient (SVRG) in a mini-batch setting. An important feature of the method is that it can be directly applied to general convex and semi-strongly convex problems, the latter being a weaker condition than strong convexity. We show that our method achieves a better overall complexity for general convex problems and linear convergence for optimal strongly convex problems. Moreover, we prove a fast iteration complexity for our method. Our experiments demonstrate the effectiveness of the method.}
}
Endnote
%0 Conference Paper
%T Accelerated Stochastic Gradient Descent for Minimizing Finite Sums
%A Atsushi Nitanda
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-nitanda16
%I PMLR
%P 195--203
%U https://proceedings.mlr.press/v51/nitanda16.html
%V 51
%X We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduced gradient (SVRG) in a mini-batch setting. An important feature of the method is that it can be directly applied to general convex and semi-strongly convex problems, the latter being a weaker condition than strong convexity. We show that our method achieves a better overall complexity for general convex problems and linear convergence for optimal strongly convex problems. Moreover, we prove a fast iteration complexity for our method. Our experiments demonstrate the effectiveness of the method.
RIS
TY - CPAPER
TI - Accelerated Stochastic Gradient Descent for Minimizing Finite Sums
AU - Atsushi Nitanda
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-nitanda16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 195
EP - 203
L1 - http://proceedings.mlr.press/v51/nitanda16.pdf
UR - https://proceedings.mlr.press/v51/nitanda16.html
AB - We propose an optimization method for minimizing finite sums of smooth convex functions. Our method incorporates accelerated gradient descent (AGD) and the stochastic variance reduced gradient (SVRG) in a mini-batch setting. An important feature of the method is that it can be directly applied to general convex and semi-strongly convex problems, the latter being a weaker condition than strong convexity. We show that our method achieves a better overall complexity for general convex problems and linear convergence for optimal strongly convex problems. Moreover, we prove a fast iteration complexity for our method. Our experiments demonstrate the effectiveness of the method.
ER -
APA
Nitanda, A. (2016). Accelerated Stochastic Gradient Descent for Minimizing Finite Sums. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:195-203. Available from https://proceedings.mlr.press/v51/nitanda16.html.
