Accelerated Stochastic Gradient Descent for Minimizing Finite Sums

Atsushi Nitanda
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:195-203, 2016.

Abstract

We propose an optimization method for minimizing finite sums of smooth convex functions. Our method combines accelerated gradient descent (AGD) with the stochastic variance reduced gradient (SVRG) in a mini-batch setting. An important feature of the method is that it can be directly applied to general convex and semi-strongly convex problems, the latter being a weaker condition than strong convexity. We show that our method achieves a better overall complexity for general convex problems and linear convergence for optimal strongly convex problems. Moreover, we prove a fast iteration complexity of our method. Our experiments show the effectiveness of our method.
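To make the AGD + SVRG combination concrete, below is a minimal illustrative sketch (not the paper's exact update rule) of a mini-batch variance-reduced gradient step paired with a Nesterov-style momentum extrapolation, applied to a least-squares finite sum. All function names, step size, and momentum values here are illustrative assumptions.

```python
import numpy as np

def accelerated_svrg(grad_full, grad_batch, x0, n, step, momentum,
                     n_epochs=20, inner=100, batch_size=10, seed=None):
    """Illustrative AGD+SVRG hybrid sketch (hypothetical, not the paper's
    exact method): mini-batch variance-reduced gradients with momentum."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    y = x0.copy()
    for _ in range(n_epochs):
        snapshot = x.copy()
        mu = grad_full(snapshot)  # full gradient at the snapshot point
        for _ in range(inner):
            idx = rng.choice(n, size=batch_size, replace=False)
            # variance-reduced mini-batch gradient estimate
            v = grad_batch(y, idx) - grad_batch(snapshot, idx) + mu
            x_next = y - step * v                    # gradient step
            y = x_next + momentum * (x_next - x)     # momentum extrapolation
            x = x_next
    return x

# Example: least squares f(x) = (1/n) * sum_i (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star
grad_full = lambda x: 2 * A.T @ (A @ x - b) / n
grad_batch = lambda x, idx: 2 * A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
x_hat = accelerated_svrg(grad_full, grad_batch, np.zeros(d), n,
                         step=0.1, momentum=0.5, seed=1)
```

The snapshot full gradient `mu` anchors each mini-batch estimate, so the variance of `v` vanishes as the iterates approach the optimum, which is what permits a constant step size and, in the strongly convex case, linear convergence.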
