Lyapunov Functions for First-Order Methods: Tight Automated Convergence Guarantees
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4897-4906, 2018.
Abstract
We present a novel way of generating Lyapunov functions for proving linear convergence rates of first-order optimization methods. Our approach provably obtains the fastest linear convergence rate that can be verified by a quadratic Lyapunov function (with given states), and only relies on solving a small-sized semidefinite program. Our approach combines the advantages of performance estimation problems (PEP, due to Drori and Teboulle (2014)) and integral quadratic constraints (IQC, due to Lessard et al. (2016)), and relies on convex interpolation (due to Taylor et al. (2017c;b)).
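To make the idea of a quadratic Lyapunov certificate concrete, here is a minimal toy illustration (not the paper's PEP/IQC machinery, and no semidefinite solver): for gradient descent on a quadratic objective, the iteration map is linear, and a quadratic Lyapunov function V(x) = xᵀPx certifies a linear rate ρ whenever AᵀPA ⪯ ρ²P in the semidefinite order. The problem data (m, L, the matrix Q) below are assumptions chosen for illustration; with P = I the check reduces to an eigenvalue test.

```python
import numpy as np

# Toy setting (assumed data): f(x) = 0.5 * x^T Q x with the eigenvalues
# of Q lying in [m, L], so f is m-strongly convex and L-smooth.
m, L = 1.0, 10.0
alpha = 2.0 / (m + L)           # classical step size for gradient descent
Q = np.diag([m, 4.0, L])        # hypothetical problem instance
A = np.eye(3) - alpha * Q       # gradient-descent iteration map: x+ = A x

# Known tight contraction factor for this step size.
rho = max(abs(1 - alpha * m), abs(1 - alpha * L))

# Lyapunov decrease condition with P = I:
#   A^T P A <= rho^2 P   (semidefinite order)
# certifies ||x_{k+1} - x*|| <= rho * ||x_k - x*|| for every iterate.
M = rho**2 * np.eye(3) - A.T @ A
assert np.all(np.linalg.eigvalsh(M) >= -1e-12), "certificate check failed"
print(f"certified linear rate rho = {rho:.4f}")
```

In the paper's setting, P (and the rate ρ²) are not fixed in advance but are decision variables of a small semidefinite program, searched over a class of quadratic Lyapunov functions of the given states; the toy above only verifies a candidate certificate for one linear iteration map.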