Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees

Adrien Taylor, Bryan Van Scoy, Laurent Lessard
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4897-4906, 2018.

Abstract

We present a novel way of generating Lyapunov functions for proving linear convergence rates of first-order optimization methods. Our approach provably obtains the fastest linear convergence rate that can be verified by a quadratic Lyapunov function (with given states), and only relies on solving a small-sized semidefinite program. Our approach combines the advantages of performance estimation problems (PEP, due to Drori and Teboulle (2014)) and integral quadratic constraints (IQC, due to Lessard et al. (2016)), and relies on convex interpolation (due to Taylor et al. (2017c;b)).
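To make the recipe in the abstract concrete, here is a minimal sketch, not the authors' code: it certifies a linear rate for plain gradient descent on L-smooth, mu-strongly convex functions using a quadratic Lyapunov function V(x) = p*||x - x*||^2 and a small semidefinite feasibility problem. It uses a single standard pointwise inequality for the gradient rather than the paper's full convex interpolation conditions, and assumes cvxpy with the SCS solver is installed; the function names (rate_certified, best_certified_rate) are illustrative.

```python
# Illustrative sketch (not the authors' implementation): certify a linear
# rate rho for gradient descent x_{k+1} = x_k - alpha * grad_f(x_k) over the
# class of L-smooth, mu-strongly convex functions via a quadratic Lyapunov
# function V(x) = p * ||x - x*||^2 and a tiny SDP feasibility problem.
import cvxpy as cp

def rate_certified(rho, alpha, mu, L):
    """True if some p >= 1, lam >= 0 verify V(x_{k+1}) <= rho^2 * V(x_k)."""
    p = cp.Variable(nonneg=True)    # Lyapunov weight: V(x) = p * ||x - x*||^2
    lam = cp.Variable(nonneg=True)  # S-procedure multiplier
    # With d = x_k - x* and g = grad_f(x_k), every L-smooth, mu-strongly
    # convex f satisfies (mu + L) * d'g >= mu*L*||d||^2 + ||g||^2, i.e.
    # [d; g]' M [d; g] >= 0 with M = [[-mu*L, (mu+L)/2], [(mu+L)/2, -1]].
    # The Lyapunov decrease is the quadratic form
    # V(x_{k+1}) - rho^2 V(x_k) = [d; g]' Q [d; g],
    # Q = p * [[1 - rho^2, -alpha], [-alpha, alpha^2]].
    # S-procedure: Q + lam * M <= 0 (as a matrix) certifies the rate.
    lmi = cp.bmat([
        [p * (1 - rho**2) - lam * mu * L, -alpha * p + lam * (mu + L) / 2],
        [-alpha * p + lam * (mu + L) / 2, alpha**2 * p - lam],
    ])
    prob = cp.Problem(cp.Minimize(0), [lmi << 0, p >= 1])  # p >= 1 fixes scale
    prob.solve(solver=cp.SCS)
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

def best_certified_rate(alpha, mu, L, tol=1e-4):
    """Bisect for the smallest rho this Lyapunov structure can certify."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if rate_certified(mid, alpha, mu, L):
            hi = mid  # certified: try a faster rate
        else:
            lo = mid  # not certified: relax the rate
    return hi

mu, L = 1.0, 10.0
alpha = 2.0 / (mu + L)
# Prints roughly 0.8182, matching the known tight rate (L - mu)/(L + mu).
print(best_certified_rate(alpha, mu, L))
```

For this one-iterate state and method, the single inequality above already recovers the tight rate; the paper's contribution is to replace such hand-picked inequalities with exact interpolation conditions, so that the resulting SDP finds the best rate verifiable by any quadratic Lyapunov function on the given states.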

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-taylor18a,
  title     = {{L}yapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees},
  author    = {Taylor, Adrien and Van Scoy, Bryan and Lessard, Laurent},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {4897--4906},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/taylor18a/taylor18a.pdf},
  url       = {https://proceedings.mlr.press/v80/taylor18a.html}
}
APA
Taylor, A., Van Scoy, B. & Lessard, L. (2018). Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4897-4906. Available from https://proceedings.mlr.press/v80/taylor18a.html.
