Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization

Vien Mai, Mikael Johansson
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6630-6639, 2020.

Abstract

Stochastic gradient methods with momentum are widely used in applications and at the core of optimization subroutines in many popular machine learning libraries. However, their sample complexities have not been obtained for problems beyond those that are convex or smooth. This paper establishes the convergence rate of a stochastic subgradient method with a momentum term of Polyak type for a broad class of non-smooth, non-convex, and constrained optimization problems. Our key innovation is the construction of a special Lyapunov function for which the proven complexity can be achieved without any tuning of the momentum parameter. For smooth problems, we extend the known complexity bound to the constrained case and demonstrate how the unconstrained case can be analyzed under weaker assumptions than the state-of-the-art. Numerical results confirm our theoretical developments.
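For orientation, the method the abstract refers to is a stochastic subgradient iteration with Polyak (heavy-ball) momentum, projected to handle constraints. Below is a minimal sketch in one common form of that update, x_{k+1} = proj_X(x_k - alpha*g_k + beta*(x_k - x_{k-1})); the objective, noise model, step sizes, and projection set here are illustrative placeholders and not the paper's exact algorithm or experiments.

    import numpy as np

    rng = np.random.default_rng(0)

    def proj_ball(z, radius=1.0):
        # Euclidean projection onto the ball {x : ||x|| <= radius}
        # (stands in for a general closed convex constraint set X).
        norm = np.linalg.norm(z)
        return z if norm <= radius else (radius / norm) * z

    def stochastic_subgrad(x):
        # Noisy subgradient of the non-smooth toy objective f(x) = ||x||_1;
        # sign(x) is a valid subgradient of the l1-norm.
        return np.sign(x) + 0.1 * rng.standard_normal(x.shape)

    def shb(x0, alpha=1e-2, beta=0.9, iters=1000, project=proj_ball):
        # Projected stochastic heavy-ball:
        #   x_{k+1} = proj_X( x_k - alpha * g_k + beta * (x_k - x_{k-1}) )
        x_prev, x = x0.copy(), x0.copy()
        for _ in range(iters):
            g = stochastic_subgrad(x)
            x_next = project(x - alpha * g + beta * (x - x_prev))
            x_prev, x = x, x_next
        return x

    x_final = shb(rng.standard_normal(5))
    print(np.linalg.norm(x_final, 1))  # near 0 for this toy l1 problem

The paper's contribution is a Lyapunov-function analysis showing that, for weakly convex objectives, an update of this family attains the proven sample complexity without tuning the momentum parameter beta.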

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-mai20b,
  title     = {Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization},
  author    = {Mai, Vien and Johansson, Mikael},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6630--6639},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/mai20b/mai20b.pdf},
  url       = {https://proceedings.mlr.press/v119/mai20b.html},
  abstract  = {Stochastic gradient methods with momentum are widely used in applications and at the core of optimization subroutines in many popular machine learning libraries. However, their sample complexities have not been obtained for problems beyond those that are convex or smooth. This paper establishes the convergence rate of a stochastic subgradient method with a momentum term of Polyak type for a broad class of non-smooth, non-convex, and constrained optimization problems. Our key innovation is the construction of a special Lyapunov function for which the proven complexity can be achieved without any tuning of the momentum parameter. For smooth problems, we extend the known complexity bound to the constrained case and demonstrate how the unconstrained case can be analyzed under weaker assumptions than the state-of-the-art. Numerical results confirm our theoretical developments.}
}
Endnote
%0 Conference Paper
%T Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization
%A Vien Mai
%A Mikael Johansson
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-mai20b
%I PMLR
%P 6630--6639
%U https://proceedings.mlr.press/v119/mai20b.html
%V 119
%X Stochastic gradient methods with momentum are widely used in applications and at the core of optimization subroutines in many popular machine learning libraries. However, their sample complexities have not been obtained for problems beyond those that are convex or smooth. This paper establishes the convergence rate of a stochastic subgradient method with a momentum term of Polyak type for a broad class of non-smooth, non-convex, and constrained optimization problems. Our key innovation is the construction of a special Lyapunov function for which the proven complexity can be achieved without any tuning of the momentum parameter. For smooth problems, we extend the known complexity bound to the constrained case and demonstrate how the unconstrained case can be analyzed under weaker assumptions than the state-of-the-art. Numerical results confirm our theoretical developments.
APA
Mai, V. & Johansson, M. (2020). Convergence of a Stochastic Gradient Method with Momentum for Non-Smooth Non-Convex Optimization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6630-6639. Available from https://proceedings.mlr.press/v119/mai20b.html.
