Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD

Marten Van Dijk, Lam Nguyen, Phuong Ha Nguyen, Dzung Phan
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6392-6400, 2019.

Abstract

We study Stochastic Gradient Descent (SGD) with diminishing step sizes for convex objective functions. We introduce a definitional framework and theory that define and characterize a core property, called curvature, of convex objective functions. In terms of curvature, we derive a new inequality that can be used to compute an optimal sequence of diminishing step sizes by solving a differential equation. Our exact solutions confirm known results in the literature and allow us to fully characterize a new regularizer with its corresponding expected convergence rates.
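
As a rough illustration of the setting, and not the paper's derived schedule, the sketch below runs SGD with the classical diminishing step sizes eta_t = eta0 / (1 + beta*t) on a simple convex least-squares objective. All names, constants, and the noisy-gradient oracle are assumptions chosen for this example; the optimal schedule obtained from the paper's differential equation is not reproduced here.

import numpy as np

def sgd_diminishing(grad_sample, w0, eta0=0.25, beta=0.5, n_iters=5000, rng=None):
    # SGD with diminishing step sizes eta_t = eta0 / (1 + beta * t).
    # grad_sample(w, rng) must return an unbiased stochastic gradient
    # of the convex objective at w.
    rng = np.random.default_rng() if rng is None else rng
    w = np.array(w0, dtype=float)
    for t in range(n_iters):
        eta_t = eta0 / (1.0 + beta * t)  # classical O(1/t) schedule
        w -= eta_t * grad_sample(w, rng)
    return w

# Example: 0.5 * ||A w - b||^2 with additive gradient noise, mimicking
# a stochastic gradient oracle for a convex objective.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, -1.0])

def grad_sample(w, rng):
    return A.T @ (A @ w - b) + 0.1 * rng.standard_normal(w.shape)

w_hat = sgd_diminishing(grad_sample, w0=[0.0, 0.0])
print(w_hat)  # approaches the minimizer [0.5, -1.0]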

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-van-dijk19a,
  title     = {Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for {SGD}},
  author    = {Van Dijk, Marten and Nguyen, Lam and Nguyen, Phuong Ha and Phan, Dzung},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6392--6400},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/van-dijk19a/van-dijk19a.pdf},
  url       = {https://proceedings.mlr.press/v97/van-dijk19a.html}
}
Endnote
%0 Conference Paper
%T Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD
%A Marten Van Dijk
%A Lam Nguyen
%A Phuong Ha Nguyen
%A Dzung Phan
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-van-dijk19a
%I PMLR
%P 6392--6400
%U https://proceedings.mlr.press/v97/van-dijk19a.html
%V 97
APA
Van Dijk, M., Nguyen, L., Nguyen, P.H. & Phan, D. (2019). Characterization of Convex Objective Functions and Optimal Expected Convergence Rates for SGD. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6392-6400. Available from https://proceedings.mlr.press/v97/van-dijk19a.html.