Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization

Michael Metel, Akiko Takeda
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4537-4545, 2019.

Abstract

Our work focuses on stochastic gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer. Research on this class of problems is quite limited, and until recently no non-asymptotic convergence results had been reported. We present two simple stochastic gradient algorithms, for finite-sum and general stochastic optimization problems, which have superior convergence complexities compared to the current state-of-the-art. We also compare our algorithms’ performance in practice for empirical risk minimization.
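To make the problem setting concrete, the sketch below shows a generic mini-batch proximal stochastic gradient loop for minimizing a smooth stochastic loss plus a non-smooth regularizer. This is not the paper's algorithms or its complexity analysis; the function names (prox_sgd, soft_threshold, stochastic_grad), the l1 stand-in regularizer (the paper targets non-convex regularizers), and all step-size and batch-size choices are illustrative assumptions only.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t * ||.||_1 (soft thresholding); a convex
        # stand-in for the non-smooth non-convex regularizers in the paper.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def prox_sgd(x0, stochastic_grad, prox, step, n_iters, rng):
        # Generic mini-batch proximal stochastic gradient iteration:
        #   x <- prox_{step * g}(x - step * estimate of grad f(x))
        x = x0.copy()
        for _ in range(n_iters):
            g_hat = stochastic_grad(x, rng)   # unbiased mini-batch gradient of the smooth loss
            x = prox(x - step * g_hat, step)  # proximal step on the regularizer
        return x

    # Toy usage: l1-regularized least squares on random data (hypothetical example).
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)

    def stochastic_grad(x, rng, batch=16):
        idx = rng.integers(0, A.shape[0], size=batch)
        return A[idx].T @ (A[idx] @ x - b) / batch

    x_hat = prox_sgd(np.zeros(50), stochastic_grad, soft_threshold,
                     step=1e-2, n_iters=1000, rng=rng)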

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-metel19a,
  title     = {Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization},
  author    = {Metel, Michael and Takeda, Akiko},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4537--4545},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/metel19a/metel19a.pdf},
  url       = {https://proceedings.mlr.press/v97/metel19a.html}
}
APA
Metel, M. & Takeda, A. (2019). Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4537-4545. Available from https://proceedings.mlr.press/v97/metel19a.html.
