Modeling simple structures and geometry for better stochastic optimization algorithms

Hilal Asi, John C. Duchi
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2425-2434, 2019.

Abstract

We develop model-based methods for stochastic optimization problems, introducing the approximate-proximal point, or aProx, family, which includes stochastic subgradient, proximal point, and bundle methods. For appropriately accurate models, the methods enjoy stronger convergence and robustness guarantees than classical approaches and typically add little to no computational overhead over stochastic subgradient methods. For example, we show that methods using the improved models converge with probability 1; these methods are also adaptive to a natural class of what we term easy optimization problems, achieving linear convergence under appropriate strong growth conditions on the objective. Our experimental investigation shows the advantages of more accurate modeling over standard subgradient methods across many smooth and non-smooth, convex and non-convex optimization problems.
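The abstract's "improved models" can be illustrated with the truncated model from the aProx family: for a nonnegative loss, the linearization is lower-bounded at zero, and the resulting proximal step has a closed form that caps the stepsize at a Polyak-type quantity. The sketch below is an illustration under assumptions we choose for the example (a noiseless least-squares problem, a fixed stepsize, and the closed-form truncated update), not the authors' code:

```python
import numpy as np

# Truncated aProx model for a nonnegative loss f(x; s):
#   f_x(y; s) = max{ f(x; s) + <g, y - x>, 0 },
# whose proximal step has the closed form
#   x_{k+1} = x_k - min(alpha_k, f(x_k; s_k) / ||g_k||^2) * g_k.

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star  # noiseless, so each loss f_i(x) = 0.5 * (a_i @ x - b_i)**2 attains 0


def truncated_step(x, a_i, b_i, alpha):
    """One aProx step with the truncated model on f_i(x) = 0.5*(a_i@x - b_i)**2."""
    resid = a_i @ x - b_i
    loss = 0.5 * resid ** 2
    g = resid * a_i                   # stochastic (sub)gradient of f_i at x
    gnorm2 = g @ g
    if gnorm2 == 0.0:
        return x                      # already minimizes this sample's loss
    step = min(alpha, loss / gnorm2)  # truncation caps the effective stepsize
    return x - step * g


x = np.zeros(d)
for _ in range(3000):
    i = rng.integers(n)
    x = truncated_step(x, A[i], b[i], alpha=1.0)

print(np.linalg.norm(x - x_star))  # distance to the optimum; should be small
```

On this interpolated problem the truncated update behaves like a Polyak stepsize and converges even with a large fixed alpha, which is the kind of robustness-to-stepsize behavior the abstract describes; a plain stochastic subgradient method with the same alpha would diverge here.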

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-asi19a,
  title     = {Modeling simple structures and geometry for better stochastic optimization algorithms},
  author    = {Asi, Hilal and Duchi, John C.},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2425--2434},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/asi19a/asi19a.pdf},
  url       = {https://proceedings.mlr.press/v89/asi19a.html},
  abstract  = {We develop model-based methods for stochastic optimization problems, introducing the approximate-proximal point, or aProx, family, which includes stochastic subgradient, proximal point, and bundle methods. For appropriately accurate models, the methods enjoy stronger convergence and robustness guarantees than classical approaches and typically add little to no computational overhead over stochastic subgradient methods. For example, we show that methods using the improved models converge with probability 1; these methods are also adaptive to a natural class of what we term easy optimization problems, achieving linear convergence under appropriate strong growth conditions on the objective. Our experimental investigation shows the advantages of more accurate modeling over standard subgradient methods across many smooth and non-smooth, convex and non-convex optimization problems.}
}
Endnote
%0 Conference Paper
%T Modeling simple structures and geometry for better stochastic optimization algorithms
%A Hilal Asi
%A John C. Duchi
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-asi19a
%I PMLR
%P 2425--2434
%U https://proceedings.mlr.press/v89/asi19a.html
%V 89
%X We develop model-based methods for stochastic optimization problems, introducing the approximate-proximal point, or aProx, family, which includes stochastic subgradient, proximal point, and bundle methods. For appropriately accurate models, the methods enjoy stronger convergence and robustness guarantees than classical approaches and typically add little to no computational overhead over stochastic subgradient methods. For example, we show that methods using the improved models converge with probability 1; these methods are also adaptive to a natural class of what we term easy optimization problems, achieving linear convergence under appropriate strong growth conditions on the objective. Our experimental investigation shows the advantages of more accurate modeling over standard subgradient methods across many smooth and non-smooth, convex and non-convex optimization problems.
APA
Asi, H. & Duchi, J. C. (2019). Modeling simple structures and geometry for better stochastic optimization algorithms. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2425-2434. Available from https://proceedings.mlr.press/v89/asi19a.html.