ADMM and Accelerated ADMM as Continuous Dynamical Systems
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1559-1567, 2018.
Abstract
Recently, there has been increasing interest in using tools from dynamical systems to analyze the behavior of simple optimization algorithms such as gradient descent and its accelerated variants. This paper strengthens these connections by deriving the differential equations that model the continuous limit of the sequence of iterates generated by the alternating direction method of multipliers (ADMM), as well as an accelerated variant. We employ the direct method of Lyapunov to analyze the stability of critical points of the resulting dynamical systems and to obtain the associated convergence rates.
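For context, a standard (scaled-form) ADMM iteration for a problem of the form minimize f(x) + g(z) subject to Ax + Bz = c is sketched below; this is the textbook formulation with penalty parameter rho, and the exact problem setup and scaling used in the paper may differ.

\begin{aligned}
x^{k+1} &= \operatorname*{arg\,min}_{x}\; f(x) + \tfrac{\rho}{2}\,\lVert Ax + Bz^{k} - c + u^{k}\rVert^{2},\\
z^{k+1} &= \operatorname*{arg\,min}_{z}\; g(z) + \tfrac{\rho}{2}\,\lVert Ax^{k+1} + Bz - c + u^{k}\rVert^{2},\\
u^{k+1} &= u^{k} + Ax^{k+1} + Bz^{k+1} - c.
\end{aligned}

Heuristically, the continuous limit studied in the paper is obtained by treating the iteration index, scaled by a vanishing effective step size, as continuous time, so that the discrete updates above are interpreted as a discretization of a differential equation whose critical points are then analyzed via Lyapunov functions.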