Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees

Adrien Taylor, Bryan Van Scoy, Laurent Lessard
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4897-4906, 2018.

Abstract

We present a novel way of generating Lyapunov functions for proving linear convergence rates of first-order optimization methods. Our approach provably obtains the fastest linear convergence rate that can be verified by a quadratic Lyapunov function (with given states), and only requires solving a small semidefinite program. It combines the advantages of performance estimation problems (PEP, due to Drori and Teboulle (2014)) and integral quadratic constraints (IQC, due to Lessard et al. (2016)), and relies on convex interpolation (due to Taylor et al. (2017c;b)).
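To make the core idea concrete, here is a minimal sketch (not the paper's general method, which handles arbitrary smooth strongly convex functions via interpolation conditions): for a quadratic objective f(x) = ½xᵀQx, gradient descent with step α is the linear map x₊ = Ax with A = I − αQ, and a quadratic Lyapunov function V(x) = xᵀPx certifies linear rate ρ whenever ρ²P − AᵀPA is positive semidefinite. The names Q, alpha, and the choice P = I below are illustrative assumptions; bisecting on ρ recovers the smallest rate this particular certificate can verify.

```python
import numpy as np

def certifies(A, P, rho, tol=1e-9):
    """Check the Lyapunov decrease condition rho^2 P - A^T P A >= 0."""
    S = rho**2 * P - A.T @ P @ A
    return np.min(np.linalg.eigvalsh(S)) >= -tol

def smallest_certified_rate(A, P, iters=60):
    """Bisect on rho in [0, 1] for the smallest rate certified by P."""
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if certifies(A, P, mid):
            hi = mid
        else:
            lo = mid
    return hi

# Illustrative problem: f is mu-strongly convex and L-smooth.
mu, L = 1.0, 10.0
Q = np.diag([mu, L])
alpha = 1.0 / L              # standard gradient step size
A = np.eye(2) - alpha * Q    # gradient-descent iteration map
rho = smallest_certified_rate(A, np.eye(2))
print(rho)                   # ≈ max(|1 - alpha*mu|, |1 - alpha*L|) = 0.9
```

In the paper, the feasibility check above becomes a semidefinite program in which P itself is a decision variable and the function class is encoded through interpolation constraints rather than a fixed quadratic.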
