Conditional Accelerated Lazy Stochastic Gradient Descent

Guanghui Lan, Sebastian Pokutta, Yi Zhou, Daniel Zink ;
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1965-1974, 2017.

Abstract

In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm that attains the optimal number of calls to a stochastic first-order oracle and a convergence rate of $O(1/\epsilon^2)$, improving over the projection-free, Online Frank-Wolfe based stochastic gradient descent of Hazan and Kale (2012), whose convergence rate is $O(1/\epsilon^4)$.
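To illustrate the projection-free idea behind Frank-Wolfe based methods, the sketch below runs a basic stochastic Frank-Wolfe iteration on the probability simplex for a made-up least-squares objective. This is a minimal, hedged illustration of the general technique only, not the paper's conditional accelerated lazy algorithm; the data, dimensions, and step-size rule are all assumptions chosen for the example. Each step calls a linear minimization oracle (here, picking a simplex vertex) instead of a projection, which is the defining feature of this algorithm class.

```python
import numpy as np

# Illustrative only: plain stochastic Frank-Wolfe on the probability simplex,
# not the paper's conditional accelerated lazy algorithm. All data are synthetic.
rng = np.random.default_rng(0)
d, n = 10, 1000
A = rng.standard_normal((n, d))
x_true = np.abs(rng.standard_normal(d))
x_true /= x_true.sum()
b = A @ x_true  # targets for f(x) = E_i[ 0.5 * (a_i^T x - b_i)^2 ]

x = np.ones(d) / d  # start at the simplex center (feasible)
for t in range(1, 501):
    i = rng.integers(n)                       # sample one data point
    g = (A[i] @ x - b[i]) * A[i]              # stochastic gradient estimate
    s = np.zeros(d)
    s[np.argmin(g)] = 1.0                     # linear minimization oracle: best simplex vertex
    gamma = 2.0 / (t + 2)                     # standard Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * s           # convex combination: no projection needed

print(np.isclose(x.sum(), 1.0) and (x >= 0).all())  # iterate stays feasible
```

Because every update is a convex combination of feasible points, the iterate never leaves the simplex, so no projection step is required.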
