A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates

Kaiwen Zhou, Fanhua Shang, James Cheng
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:5980-5989, 2018.

Abstract

Recent years have witnessed exciting progress in the study of stochastic variance reduced gradient methods (e.g., SVRG, SAGA), their accelerated variants (e.g., Katyusha) and their extensions in many different settings (e.g., online, sparse, asynchronous, distributed). Among them, accelerated methods enjoy improved convergence rates but have complex coupling structures, which makes them hard to extend to more settings (e.g., sparse and asynchronous) due to the presence of perturbations. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG), which enjoys the best-known convergence rates for both strongly convex and non-strongly convex problems. Moreover, we also present its efficient sparse and asynchronous variants, and theoretically analyze its convergence rates in these settings. Finally, extensive experiments for various machine learning problems such as logistic regression are presented to illustrate the practical improvement in both serial and asynchronous settings.
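To make the variance-reduction idea concrete, the following is a minimal SVRG-style sketch on a least-squares problem. This illustrates the general technique the paper builds on (a snapshot full gradient correcting a stochastic gradient), not the MiG algorithm itself, whose acceleration coupling is not shown; all problem sizes and the step size are illustrative choices.

```python
import numpy as np

# SVRG-style variance-reduced gradient descent on least squares
# f(x) = (1/2n) * ||A x - b||^2  (illustrative toy problem).
rng = np.random.default_rng(0)
n, d = 100, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star  # noiseless target so x_star is the exact minimizer

def grad_i(x, i):
    # Gradient of the i-th component (1/2) * (a_i^T x - b_i)^2.
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
eta = 0.05  # step size (illustrative)
for epoch in range(30):
    snapshot = x.copy()
    mu = full_grad(snapshot)  # full gradient at the snapshot point
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient: unbiased, and its
        # variance shrinks as both x and snapshot approach x_star.
        g = grad_i(x, i) - grad_i(snapshot, i) + mu
        x -= eta * g

err = np.linalg.norm(x - x_star)
print(err)
```

The key line is the corrected gradient `g`: plain SGD would use `grad_i(x, i)` alone, whose variance stays bounded away from zero and forces a decaying step size; the snapshot correction lets a constant step size converge linearly on strongly convex problems.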

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-zhou18c,
  title     = {A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates},
  author    = {Zhou, Kaiwen and Shang, Fanhua and Cheng, James},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {5980--5989},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/zhou18c/zhou18c.pdf},
  url       = {http://proceedings.mlr.press/v80/zhou18c.html},
  abstract  = {Recent years have witnessed exciting progress in the study of stochastic variance reduced gradient methods (e.g., SVRG, SAGA), their accelerated variants (e.g., Katyusha) and their extensions in many different settings (e.g., online, sparse, asynchronous, distributed). Among them, accelerated methods enjoy improved convergence rates but have complex coupling structures, which makes them hard to extend to more settings (e.g., sparse and asynchronous) due to the presence of perturbations. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG), which enjoys the best-known convergence rates for both strongly convex and non-strongly convex problems. Moreover, we also present its efficient sparse and asynchronous variants, and theoretically analyze its convergence rates in these settings. Finally, extensive experiments for various machine learning problems such as logistic regression are presented to illustrate the practical improvement in both serial and asynchronous settings.}
}
Endnote
%0 Conference Paper
%T A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates
%A Kaiwen Zhou
%A Fanhua Shang
%A James Cheng
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-zhou18c
%I PMLR
%P 5980--5989
%U http://proceedings.mlr.press/v80/zhou18c.html
%V 80
%X Recent years have witnessed exciting progress in the study of stochastic variance reduced gradient methods (e.g., SVRG, SAGA), their accelerated variants (e.g., Katyusha) and their extensions in many different settings (e.g., online, sparse, asynchronous, distributed). Among them, accelerated methods enjoy improved convergence rates but have complex coupling structures, which makes them hard to extend to more settings (e.g., sparse and asynchronous) due to the presence of perturbations. In this paper, we introduce a simple stochastic variance reduced algorithm (MiG), which enjoys the best-known convergence rates for both strongly convex and non-strongly convex problems. Moreover, we also present its efficient sparse and asynchronous variants, and theoretically analyze its convergence rates in these settings. Finally, extensive experiments for various machine learning problems such as logistic regression are presented to illustrate the practical improvement in both serial and asynchronous settings.
APA
Zhou, K., Shang, F. & Cheng, J. (2018). A Simple Stochastic Variance Reduced Algorithm with Fast Convergence Rates. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:5980-5989. Available from http://proceedings.mlr.press/v80/zhou18c.html.