Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions

Adrien Taylor, Francis Bach
Proceedings of the Thirty-Second Conference on Learning Theory, PMLR 99:2934-2992, 2019.

Abstract

We provide a novel computer-assisted technique for systematically analyzing first-order methods for optimization. In contrast with previous works, the approach is particularly suited for handling sublinear convergence rates and stochastic oracles. The technique relies on semidefinite programming and potential functions. It allows simultaneously obtaining worst-case guarantees on the behavior of those algorithms, and assisting in choosing appropriate parameters for tuning their worst-case performances. The technique also benefits from comfortable tightness guarantees, meaning that unsatisfactory results can be improved only by changing the setting. We use the approach for analyzing deterministic and stochastic first-order methods under different assumptions on the nature of the stochastic noise. Among others, we treat unstructured noise with bounded variance, different noise models arising in over-parametrized expectation minimization problems, and randomized block-coordinate descent schemes.
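To give a concrete flavor of the potential-function viewpoint (an illustrative classical example, not code from the paper): for an L-smooth convex function f minimized by gradient descent with step size 1/L, the potential φ_k = k (f(x_k) − f(x*)) + (L/2) ‖x_k − x*‖² is nonincreasing along the iterates, which immediately yields the sublinear rate f(x_k) − f(x*) ≤ L‖x_0 − x*‖²/(2k). Inequalities of exactly this shape are what the paper's semidefinite-programming machinery verifies and tunes automatically. A minimal numerical sketch on a quadratic:

```python
import numpy as np

# Sketch: check the classical potential inequality behind the O(1/k) rate
# of gradient descent on an L-smooth convex quadratic f(x) = 0.5 x^T D x
# (minimizer x* = 0, optimal value f* = 0). This illustrates the kind of
# potential the paper's SDP machinery certifies; it is not the paper's code.
d = np.array([1.0, 4.0])          # eigenvalues of D; smoothness L = max(d)
L = d.max()
f = lambda x: 0.5 * d @ (x * x)   # f(x) = 0.5 x^T D x
grad = lambda x: d * x

x = np.array([1.0, 1.0])
potentials = []
for k in range(20):
    # phi_k = k * (f(x_k) - f*) + (L/2) * ||x_k - x*||^2
    potentials.append(k * f(x) + 0.5 * L * x @ x)
    x = x - grad(x) / L           # gradient step with step size 1/L

# The potential never increases along the trajectory ...
assert all(p1 <= p0 + 1e-12 for p0, p1 in zip(potentials, potentials[1:]))
# ... so f(x_k) - f* <= phi_k / k <= phi_0 / k = L * ||x0 - x*||^2 / (2k).
```

Here the potential decrease is checked numerically on one instance; the contribution of the paper is to certify such decreases in the worst case, over entire function classes, via semidefinite programming.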

Cite this Paper
BibTeX
@InProceedings{pmlr-v99-taylor19a,
  title     = {Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions},
  author    = {Taylor, Adrien and Bach, Francis},
  booktitle = {Proceedings of the Thirty-Second Conference on Learning Theory},
  pages     = {2934--2992},
  year      = {2019},
  editor    = {Beygelzimer, Alina and Hsu, Daniel},
  volume    = {99},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--28 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v99/taylor19a/taylor19a.pdf},
  url       = {https://proceedings.mlr.press/v99/taylor19a.html},
  abstract  = {We provide a novel computer-assisted technique for systematically analyzing first-order methods for optimization. In contrast with previous works, the approach is particularly suited for handling sublinear convergence rates and stochastic oracles. The technique relies on semidefinite programming and potential functions. It allows simultaneously obtaining worst-case guarantees on the behavior of those algorithms, and assisting in choosing appropriate parameters for tuning their worst-case performances. The technique also benefits from comfortable tightness guarantees, meaning that unsatisfactory results can be improved only by changing the setting. We use the approach for analyzing deterministic and stochastic first-order methods under different assumptions on the nature of the stochastic noise. Among others, we treat unstructured noise with bounded variance, different noise models arising in over-parametrized expectation minimization problems, and randomized block-coordinate descent schemes.}
}
Endnote
%0 Conference Paper
%T Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions
%A Adrien Taylor
%A Francis Bach
%B Proceedings of the Thirty-Second Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Alina Beygelzimer
%E Daniel Hsu
%F pmlr-v99-taylor19a
%I PMLR
%P 2934--2992
%U https://proceedings.mlr.press/v99/taylor19a.html
%V 99
%X We provide a novel computer-assisted technique for systematically analyzing first-order methods for optimization. In contrast with previous works, the approach is particularly suited for handling sublinear convergence rates and stochastic oracles. The technique relies on semidefinite programming and potential functions. It allows simultaneously obtaining worst-case guarantees on the behavior of those algorithms, and assisting in choosing appropriate parameters for tuning their worst-case performances. The technique also benefits from comfortable tightness guarantees, meaning that unsatisfactory results can be improved only by changing the setting. We use the approach for analyzing deterministic and stochastic first-order methods under different assumptions on the nature of the stochastic noise. Among others, we treat unstructured noise with bounded variance, different noise models arising in over-parametrized expectation minimization problems, and randomized block-coordinate descent schemes.
APA
Taylor, A., & Bach, F. (2019). Stochastic first-order methods: non-asymptotic and computer-aided analyses via potential functions. Proceedings of the Thirty-Second Conference on Learning Theory, in Proceedings of Machine Learning Research 99:2934-2992. Available from https://proceedings.mlr.press/v99/taylor19a.html.