Conditional Accelerated Lazy Stochastic Gradient Descent
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1965-1974, 2017.
Abstract
In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and a convergence rate of $O(1/\epsilon^2)$, improving over the projection-free, Online Frank-Wolfe based stochastic gradient descent of (Hazan and Kale, 2012), which has a convergence rate of $O(1/\epsilon^4)$.
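To illustrate the projection-free setting the abstract refers to, the following is a minimal sketch of a stochastic Frank-Wolfe iteration (not the paper's algorithm): instead of projecting onto the feasible set, each step calls a linear minimization oracle and takes a convex combination, so iterates stay feasible. The objective, noise level, and step-size schedule below are illustrative assumptions.

```python
import numpy as np

# Illustrative stochastic Frank-Wolfe step over the probability simplex.
# NOT the paper's conditional accelerated lazy SGD; a generic sketch only.
# Objective: minimize E[0.5 * ||x - z||^2], observing z with Gaussian noise
# (z_true and the noise level 0.1 are hypothetical choices).

rng = np.random.default_rng(0)
d = 5
z_true = np.full(d, 1.0 / d)          # optimum lies inside the simplex

x = np.zeros(d)
x[0] = 1.0                            # start at a vertex of the simplex
T = 2000
for t in range(1, T + 1):
    z_noisy = z_true + 0.1 * rng.standard_normal(d)
    g = x - z_noisy                   # stochastic gradient of 0.5*||x - z||^2
    # Linear minimization oracle over the simplex: the best point is a vertex
    v = np.zeros(d)
    v[np.argmin(g)] = 1.0
    gamma = 2.0 / (t + 2)             # standard Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * v   # convex combination: no projection needed

print(np.round(x, 2))                 # should be close to the uniform vector
```

The key point is that feasibility is maintained by construction (each iterate is a convex combination of simplex vertices), which is what makes such methods attractive when projections are expensive.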