A Lyapunov analysis for accelerated gradient methods: from deterministic to stochastic case

Maxime Laborde, Adam Oberman
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:602-612, 2020.

Abstract

Recent work by Su, Boyd and Candes made a connection between Nesterov’s accelerated gradient descent method and an ordinary differential equation (ODE). We show that this connection can be extended to the case of stochastic gradients, and develop a Lyapunov-function-based convergence rate proof for Nesterov’s accelerated stochastic gradient descent. In the full-gradient case, we show that Nesterov’s method arises as a straightforward discretization of a modified ODE. Established Lyapunov analysis is used to recover the accelerated rates of convergence in both continuous and discrete time. Moreover, the Lyapunov analysis can be extended to the case of stochastic gradients. The result is a unified approach to acceleration in both continuous and discrete time, for both stochastic and full gradients.
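For context (not part of the published abstract): in the Su, Boyd and Candes analysis, Nesterov's method for a convex function f is modeled by the ODE

    \ddot{X}(t) + \frac{3}{t}\,\dot{X}(t) + \nabla f(X(t)) = 0,

and the standard Lyapunov function

    \mathcal{E}(t) = t^2\big(f(X(t)) - f^*\big) + 2\,\big\| X(t) + \tfrac{t}{2}\,\dot{X}(t) - x^* \big\|^2

is nonincreasing along solutions, which yields the accelerated rate f(X(t)) - f^* \le 2\|x_0 - x^*\|^2 / t^2. The notation above is the standard one from that literature, not necessarily the paper's own.

Below is a minimal sketch of the discrete method the abstract refers to: Nesterov-style accelerated gradient descent, with an optional noisy gradient as a toy stand-in for the stochastic setting. The function name and the quadratic test problem are illustrative assumptions, not the authors' code.

import numpy as np

def nesterov_agd(grad, x0, lr, n_steps, noise_std=0.0, seed=0):
    # Nesterov-style accelerated gradient descent; with noise_std > 0
    # the gradient is perturbed by Gaussian noise as a simple
    # stochastic-gradient stand-in (illustrative only).
    rng = np.random.default_rng(seed)
    x, x_prev = x0.copy(), x0.copy()
    for k in range(1, n_steps + 1):
        # The momentum weight (k - 1)/(k + 2) mirrors the 3/t damping
        # term in the ODE above.
        y = x + (k - 1) / (k + 2) * (x - x_prev)
        g = grad(y) + noise_std * rng.standard_normal(y.shape)
        x_prev, x = x, y - lr * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad = lambda x: A.T @ (A @ x - b)
x_hat = nesterov_agd(grad, x0=np.zeros(2), lr=0.05, n_steps=500)
print(x_hat)  # approaches the minimizer for a small enough step size

With noise_std = 0 this is the deterministic method; with noise_std > 0 it illustrates the stochastic-gradient regime the paper's extended Lyapunov analysis covers.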

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-laborde20a,
  title     = {A Lyapunov analysis for accelerated gradient methods: from deterministic to stochastic case},
  author    = {Laborde, Maxime and Oberman, Adam},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {602--612},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/laborde20a/laborde20a.pdf},
  url       = {https://proceedings.mlr.press/v108/laborde20a.html},
  abstract  = {Recent work by Su, Boyd and Candes made a connection between Nesterov’s accelerated gradient descent method and an ordinary differential equation (ODE). We show that this connection can be extended to the case of stochastic gradients, and develop a Lyapunov-function-based convergence rate proof for Nesterov’s accelerated stochastic gradient descent. In the full-gradient case, we show that Nesterov’s method arises as a straightforward discretization of a modified ODE. Established Lyapunov analysis is used to recover the accelerated rates of convergence in both continuous and discrete time. Moreover, the Lyapunov analysis can be extended to the case of stochastic gradients. The result is a unified approach to acceleration in both continuous and discrete time, for both stochastic and full gradients.}
}
Endnote
%0 Conference Paper
%T A Lyapunov analysis for accelerated gradient methods: from deterministic to stochastic case
%A Maxime Laborde
%A Adam Oberman
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-laborde20a
%I PMLR
%P 602--612
%U https://proceedings.mlr.press/v108/laborde20a.html
%V 108
%X Recent work by Su, Boyd and Candes made a connection between Nesterov’s accelerated gradient descent method and an ordinary differential equation (ODE). We show that this connection can be extended to the case of stochastic gradients, and develop a Lyapunov-function-based convergence rate proof for Nesterov’s accelerated stochastic gradient descent. In the full-gradient case, we show that Nesterov’s method arises as a straightforward discretization of a modified ODE. Established Lyapunov analysis is used to recover the accelerated rates of convergence in both continuous and discrete time. Moreover, the Lyapunov analysis can be extended to the case of stochastic gradients. The result is a unified approach to acceleration in both continuous and discrete time, for both stochastic and full gradients.
APA
Laborde, M. & Oberman, A. (2020). A Lyapunov analysis for accelerated gradient methods: from deterministic to stochastic case. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:602-612. Available from https://proceedings.mlr.press/v108/laborde20a.html.