Accelerated Stochastic Mirror Descent: From Continuous-time Dynamics to Discrete-time Algorithms

Pan Xu, Tianhao Wang, Quanquan Gu
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1087-1096, 2018.

Abstract

We present a new framework to analyze accelerated stochastic mirror descent through the lens of continuous-time stochastic dynamic systems. It enables us to design new algorithms, and perform a unified and simple analysis of the convergence rates of these algorithms. More specifically, under this framework, we provide a Lyapunov function based analysis for the continuous-time stochastic dynamics, as well as several new discrete-time algorithms derived from the continuous-time dynamics. We show that for general convex objective functions, the derived discrete-time algorithms attain the optimal convergence rate. Empirical experiments corroborate our theory.
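The paper's accelerated variants are not reproduced here, but the base algorithm they build on, stochastic mirror descent, can be sketched to fix ideas. The following is a minimal illustration on the probability simplex with the negative-entropy mirror map (exponentiated gradient); the objective, step size, and averaging scheme are illustrative choices, not the authors' method.

```python
import numpy as np

def stochastic_mirror_descent(grad_fn, x0, steps=1000, eta=0.1, rng=None):
    """Plain (non-accelerated) stochastic mirror descent on the simplex
    with the negative-entropy mirror map, i.e. exponentiated gradient.
    grad_fn(x, rng) returns a stochastic gradient estimate at x."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad_fn(x, rng)
        # Mirror step: gradient update in the dual space, then map
        # back to the primal feasible set (here, renormalization).
        x = x * np.exp(-eta * g)
        x /= x.sum()
        avg += (x - avg) / t  # running average of the iterates
    return avg

# Hypothetical example: minimize a noisy quadratic over the simplex.
A = np.diag([1.0, 2.0, 3.0])
c = np.array([0.2, 0.5, 0.3])  # c lies in the simplex, so it is the minimizer

def grad_fn(x, rng):
    return 2 * A @ (x - c) + 0.01 * rng.standard_normal(3)

x_hat = stochastic_mirror_descent(grad_fn, np.ones(3) / 3)
```

The averaged iterate `x_hat` stays in the simplex by construction and approaches the minimizer `c`; the accelerated schemes analyzed in the paper add momentum-style coupling between sequences to improve this basic recursion's rate.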

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-xu18e,
  title     = {Accelerated Stochastic Mirror Descent: From Continuous-time Dynamics to Discrete-time Algorithms},
  author    = {Xu, Pan and Wang, Tianhao and Gu, Quanquan},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1087--1096},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/xu18e/xu18e.pdf},
  url       = {https://proceedings.mlr.press/v84/xu18e.html},
  abstract  = {We present a new framework to analyze accelerated stochastic mirror descent through the lens of continuous-time stochastic dynamic systems. It enables us to design new algorithms, and perform a unified and simple analysis of the convergence rates of these algorithms. More specifically, under this framework, we provide a Lyapunov function based analysis for the continuous-time stochastic dynamics, as well as several new discrete-time algorithms derived from the continuous-time dynamics. We show that for general convex objective functions, the derived discrete-time algorithms attain the optimal convergence rate. Empirical experiments corroborate our theory.}
}
Endnote
%0 Conference Paper
%T Accelerated Stochastic Mirror Descent: From Continuous-time Dynamics to Discrete-time Algorithms
%A Pan Xu
%A Tianhao Wang
%A Quanquan Gu
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-xu18e
%I PMLR
%P 1087--1096
%U https://proceedings.mlr.press/v84/xu18e.html
%V 84
%X We present a new framework to analyze accelerated stochastic mirror descent through the lens of continuous-time stochastic dynamic systems. It enables us to design new algorithms, and perform a unified and simple analysis of the convergence rates of these algorithms. More specifically, under this framework, we provide a Lyapunov function based analysis for the continuous-time stochastic dynamics, as well as several new discrete-time algorithms derived from the continuous-time dynamics. We show that for general convex objective functions, the derived discrete-time algorithms attain the optimal convergence rate. Empirical experiments corroborate our theory.
APA
Xu, P., Wang, T. & Gu, Q. (2018). Accelerated Stochastic Mirror Descent: From Continuous-time Dynamics to Discrete-time Algorithms. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1087-1096. Available from https://proceedings.mlr.press/v84/xu18e.html.