Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework

Arman Sharifi Kolarijani, Peyman Mohajerin Esfahani, Tamas Keviczky
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2728-2736, 2018.

Abstract

Ordinary differential equations, and in general a dynamical system viewpoint, have seen a resurgence of interest in developing fast optimization methods, mainly thanks to the availability of well-established analysis tools. In this study, we pursue a similar objective and propose a class of hybrid control systems that adopts a 2nd-order differential equation as its continuous flow. A distinctive feature of the proposed differential equation in comparison with the existing literature is a state-dependent, time-invariant damping term that acts as a feedback control input. Given a user-defined scalar $\alpha$, it is shown that the proposed control input steers the state trajectories to the global optimizer of a desired objective function with a guaranteed rate of convergence $\mathcal{O}(e^{-\alpha t})$. Our framework requires that the objective function satisfies the so-called Polyak–Łojasiewicz inequality. Furthermore, a discretization method is introduced such that the resulting discrete dynamical system possesses an exponential rate of convergence.
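As a rough illustration of the dynamical-system viewpoint described above (not the paper's actual hybrid control law), the sketch below forward-Euler-integrates a generic second-order ODE with a damping term on a quadratic objective, which satisfies the Polyak–Łojasiewicz inequality. The constant damping coefficient is a placeholder assumption; the paper designs a state-dependent feedback damping instead.

```python
import numpy as np

# Sketch: simulate x'' + d(x, v) v + grad f(x) = 0 with forward Euler.
# f(x) = 0.5 * mu * x^2 satisfies the Polyak-Lojasiewicz inequality
# with constant mu. The damping d is a hypothetical constant here,
# standing in for the paper's state-dependent feedback control input.

mu = 2.0
grad = lambda x: mu * x            # gradient of f(x) = 0.5 * mu * x**2
damping = lambda x, v: 3.0         # placeholder constant damping (assumption)

dt, T = 1e-3, 10.0                 # step size and time horizon
x, v = 5.0, 0.0                    # initial position and velocity
for _ in range(int(T / dt)):
    x, v = x + dt * v, v - dt * (damping(x, v) * v + grad(x))

f_val = 0.5 * mu * x**2
print(f_val)                       # objective value decays toward 0
```

With these constants the linearized system has eigenvalues -1 and -2, so the trajectory (and hence the objective value) decays exponentially, mirroring the $\mathcal{O}(e^{-\alpha t})$ behavior the paper guarantees for its controlled flow.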

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-kolarijani18a,
  title     = {Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework},
  author    = {Kolarijani, Arman Sharifi and Esfahani, Peyman Mohajerin and Keviczky, Tamas},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {2728--2736},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/kolarijani18a/kolarijani18a.pdf},
  url       = {https://proceedings.mlr.press/v80/kolarijani18a.html},
  abstract  = {Ordinary differential equations, and in general a dynamical system viewpoint, have seen a resurgence of interest in developing fast optimization methods, mainly thanks to the availability of well-established analysis tools. In this study, we pursue a similar objective and propose a class of hybrid control systems that adopts a 2nd-order differential equation as its continuous flow. A distinctive feature of the proposed differential equation in comparison with the existing literature is a state-dependent, time-invariant damping term that acts as a feedback control input. Given a user-defined scalar $\alpha$, it is shown that the proposed control input steers the state trajectories to the global optimizer of a desired objective function with a guaranteed rate of convergence $\mathcal{O}(e^{-\alpha t})$. Our framework requires that the objective function satisfies the so-called Polyak–{Ł}ojasiewicz inequality. Furthermore, a discretization method is introduced such that the resulting discrete dynamical system possesses an exponential rate of convergence.}
}
Endnote
%0 Conference Paper
%T Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework
%A Arman Sharifi Kolarijani
%A Peyman Mohajerin Esfahani
%A Tamas Keviczky
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-kolarijani18a
%I PMLR
%P 2728--2736
%U https://proceedings.mlr.press/v80/kolarijani18a.html
%V 80
%X Ordinary differential equations, and in general a dynamical system viewpoint, have seen a resurgence of interest in developing fast optimization methods, mainly thanks to the availability of well-established analysis tools. In this study, we pursue a similar objective and propose a class of hybrid control systems that adopts a 2nd-order differential equation as its continuous flow. A distinctive feature of the proposed differential equation in comparison with the existing literature is a state-dependent, time-invariant damping term that acts as a feedback control input. Given a user-defined scalar $\alpha$, it is shown that the proposed control input steers the state trajectories to the global optimizer of a desired objective function with a guaranteed rate of convergence $\mathcal{O}(e^{-\alpha t})$. Our framework requires that the objective function satisfies the so-called Polyak–Łojasiewicz inequality. Furthermore, a discretization method is introduced such that the resulting discrete dynamical system possesses an exponential rate of convergence.
APA
Kolarijani, A. S., Esfahani, P. M., & Keviczky, T. (2018). Fast Gradient-Based Methods with Exponential Rate: A Hybrid Control Framework. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:2728-2736. Available from https://proceedings.mlr.press/v80/kolarijani18a.html.

Related Material