Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2728-2736, 2018.
Abstract
Ordinary differential equations, and more generally a dynamical-systems viewpoint, have seen a resurgence of interest in the development of fast optimization methods, mainly thanks to the availability of well-established analysis tools. In this study, we pursue a similar objective and propose a class of hybrid control systems that adopts a second-order differential equation as its continuous flow. A distinctive feature of the proposed differential equation, in comparison with the existing literature, is a state-dependent, time-invariant damping term that acts as a feedback control input. Given a user-defined scalar $\alpha$, it is shown that the proposed control input steers the state trajectories to the global optimizer of a desired objective function with a guaranteed rate of convergence $\mathcal{O}(e^{-\alpha t})$. Our framework requires that the objective function satisfies the so-called Polyak–Łojasiewicz inequality. Furthermore, a discretization method is introduced such that the resulting discrete dynamical system possesses an exponential rate of convergence.
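To make the continuous-flow idea concrete, the following is a minimal sketch of simulating a damped second-order ODE of the form $\ddot{x} + d\,\dot{x} + \nabla f(x) = 0$ with a forward-Euler scheme. Note that the abstract does not specify the authors' state-dependent feedback law, so this sketch uses a simple constant damping coefficient (a heavy-ball-style surrogate) and a quadratic objective, which satisfies the Polyak–Łojasiewicz inequality; the function name `simulate_flow` and all parameter choices are illustrative assumptions, not the paper's method.

```python
import numpy as np

def simulate_flow(grad, x0, damping=2.0, h=1e-3, steps=20_000):
    """Forward-Euler integration of the second-order flow
        x'' + damping * x' + grad_f(x) = 0.
    `damping` is a constant stand-in for the paper's state-dependent
    feedback control input (hypothetical simplification)."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        a = -damping * v - grad(x)   # acceleration from the ODE
        x = x + h * v                # Euler step for position
        v = v + h * a                # Euler step for velocity
    return x

# PL example: f(x) = 0.5 * ||x||^2 has gradient x and minimizer 0.
x_final = simulate_flow(lambda x: x, [3.0, -2.0])
```

Here the trajectory decays toward the global minimizer at the origin; with exponential convergence, `x_final` is numerically close to zero after integrating to time `h * steps = 20`.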