ADMM and Accelerated ADMM as Continuous Dynamical Systems

Guilherme Franca, Daniel Robinson, Rene Vidal
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1559-1567, 2018.

Abstract

Recently, there has been an increasing interest in using tools from dynamical systems to analyze the behavior of simple optimization algorithms such as gradient descent and accelerated variants. This paper strengthens such connections by deriving the differential equations that model the continuous limit of the sequence of iterates generated by the alternating direction method of multipliers, as well as an accelerated variant. We employ the direct method of Lyapunov to analyze the stability of critical points of the dynamical systems and to obtain associated convergence rates.
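The abstract concerns the continuous limit of the discrete ADMM iteration. For reference, the discrete algorithm whose limit the paper studies alternates minimization over two blocks of variables and a dual update; the sketch below shows standard scaled-dual ADMM applied to a lasso problem, min (1/2)||Ax − b||² + λ||z||₁ subject to x = z. The problem instance, variable names, and parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of k * ||.||_1: shrink each entry toward zero by k.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=500):
    """Scaled-dual ADMM for min (1/2)||Ax - b||^2 + lam*||z||_1  s.t.  x = z.

    Illustrative sketch only: problem choice and parameters are assumptions.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Quantities reused by every x-update.
    AtA_rhoI = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(AtA_rhoI, Atb + rho * (z - u))  # x-minimization
        z = soft_threshold(x + u, lam / rho)                # z-minimization (prox)
        u = u + x - z                                       # dual ascent step
    return x, z
```

Viewing each of these three updates as one step of size proportional to a vanishing parameter is what yields the differential equations the paper derives; the Lyapunov analysis then studies the resulting flow rather than the iterates.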

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-franca18a,
  title     = {{ADMM} and Accelerated {ADMM} as Continuous Dynamical Systems},
  author    = {Franca, Guilherme and Robinson, Daniel and Vidal, Rene},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {1559--1567},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/franca18a/franca18a.pdf},
  url       = {https://proceedings.mlr.press/v80/franca18a.html},
  abstract  = {Recently, there has been an increasing interest in using tools from dynamical systems to analyze the behavior of simple optimization algorithms such as gradient descent and accelerated variants. This paper strengthens such connections by deriving the differential equations that model the continuous limit of the sequence of iterates generated by the alternating direction method of multipliers, as well as an accelerated variant. We employ the direct method of Lyapunov to analyze the stability of critical points of the dynamical systems and to obtain associated convergence rates.}
}
Endnote
%0 Conference Paper
%T ADMM and Accelerated ADMM as Continuous Dynamical Systems
%A Guilherme Franca
%A Daniel Robinson
%A Rene Vidal
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-franca18a
%I PMLR
%P 1559--1567
%U https://proceedings.mlr.press/v80/franca18a.html
%V 80
%X Recently, there has been an increasing interest in using tools from dynamical systems to analyze the behavior of simple optimization algorithms such as gradient descent and accelerated variants. This paper strengthens such connections by deriving the differential equations that model the continuous limit of the sequence of iterates generated by the alternating direction method of multipliers, as well as an accelerated variant. We employ the direct method of Lyapunov to analyze the stability of critical points of the dynamical systems and to obtain associated convergence rates.
APA
Franca, G., Robinson, D. & Vidal, R. (2018). ADMM and Accelerated ADMM as Continuous Dynamical Systems. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:1559-1567. Available from https://proceedings.mlr.press/v80/franca18a.html.