On the convergence of adaptive first order methods: Proximal gradient and alternating minimization algorithms

Puya Latafat, Andreas Themelis, Panagiotis Patrinos
Proceedings of the 6th Annual Learning for Dynamics & Control Conference, PMLR 242:197-208, 2024.

Abstract

Building upon recent works on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds. Different choices of the parameters are discussed and the efficacy of the resulting methods is demonstrated through numerical simulations. In an attempt to better understand the underlying theory, its convergence is established in a more general setting that allows for time-varying parameters. Finally, an adaptive alternating minimization algorithm is presented by exploring the dual setting. This algorithm not only incorporates additional adaptivity but also expands its applicability beyond standard strongly convex settings.
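The paper itself develops the AdaPG stepsize policies; as a rough illustration of the kind of linesearch-free adaptive proximal gradient iteration the abstract refers to, the sketch below updates the stepsize from a local curvature estimate in the spirit of Malitsky & Mishchenko (2020). The update rule, the constants, and the helper names (`prox_l1`, `adaptive_prox_grad`) are illustrative assumptions, not the AdaPG policy from the paper.

```python
import numpy as np

def prox_l1(z, t):
    """Soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_prox_grad(grad_f, prox_g, x0, gamma0=1e-3, iters=500):
    """Linesearch-free proximal gradient with an adaptive stepsize.

    The stepsize grows geometrically but is capped by a local curvature
    estimate L_k = ||grad_f(x_k) - grad_f(x_{k-1})|| / ||x_k - x_{k-1}||.
    This is an illustrative rule only, not the paper's AdaPG policy.
    """
    x_prev, g_prev = x0, grad_f(x0)
    gamma_prev = gamma = gamma0
    x = prox_g(x_prev - gamma * g_prev, gamma)
    for _ in range(iters):
        g = grad_f(x)
        dx, dg = x - x_prev, g - g_prev
        L = np.linalg.norm(dg) / max(np.linalg.norm(dx), 1e-12)
        # grow the stepsize, but never beyond a fraction of 1/L_k
        gamma_next = min(gamma * np.sqrt(1.0 + gamma / gamma_prev),
                         0.5 / max(L, 1e-12))
        x_prev, g_prev = x, g
        gamma_prev, gamma = gamma, gamma_next
        x = prox_g(x - gamma * g, gamma)
    return x

# toy lasso: min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 20)), rng.standard_normal(40), 0.1
x_star = adaptive_prox_grad(lambda x: A.T @ (A @ x - b),
                            lambda z, t: prox_l1(z, t * lam),
                            np.zeros(20))
```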

Cite this Paper


BibTeX
@InProceedings{pmlr-v242-latafat24a,
  title     = {On the convergence of adaptive first order methods: proximal gradient and alternating minimization algorithms},
  author    = {Latafat, Puya and Themelis, Andreas and Patrinos, Panagiotis},
  booktitle = {Proceedings of the 6th Annual Learning for Dynamics \& Control Conference},
  pages     = {197--208},
  year      = {2024},
  editor    = {Abate, Alessandro and Cannon, Mark and Margellos, Kostas and Papachristodoulou, Antonis},
  volume    = {242},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--17 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v242/latafat24a/latafat24a.pdf},
  url       = {https://proceedings.mlr.press/v242/latafat24a.html},
  abstract  = {Building upon recent works on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds. Different choices of the parameters are discussed and the efficacy of the resulting methods is demonstrated through numerical simulations. In an attempt to better understand the underlying theory, its convergence is established in a more general setting that allows for time-varying parameters. Finally, an adaptive alternating minimization algorithm is presented by exploring the dual setting. This algorithm not only incorporates additional adaptivity but also expands its applicability beyond standard strongly convex settings.}
}
Endnote
%0 Conference Paper
%T On the convergence of adaptive first order methods: Proximal gradient and alternating minimization algorithms
%A Puya Latafat
%A Andreas Themelis
%A Panagiotis Patrinos
%B Proceedings of the 6th Annual Learning for Dynamics & Control Conference
%C Proceedings of Machine Learning Research
%D 2024
%E Alessandro Abate
%E Mark Cannon
%E Kostas Margellos
%E Antonis Papachristodoulou
%F pmlr-v242-latafat24a
%I PMLR
%P 197--208
%U https://proceedings.mlr.press/v242/latafat24a.html
%V 242
%X Building upon recent works on linesearch-free adaptive proximal gradient methods, this paper proposes AdaPG, a framework that unifies and extends existing results by providing larger stepsize policies and improved lower bounds. Different choices of the parameters are discussed and the efficacy of the resulting methods is demonstrated through numerical simulations. In an attempt to better understand the underlying theory, its convergence is established in a more general setting that allows for time-varying parameters. Finally, an adaptive alternating minimization algorithm is presented by exploring the dual setting. This algorithm not only incorporates additional adaptivity but also expands its applicability beyond standard strongly convex settings.
APA
Latafat, P., Themelis, A. & Patrinos, P. (2024). On the convergence of adaptive first order methods: Proximal gradient and alternating minimization algorithms. Proceedings of the 6th Annual Learning for Dynamics & Control Conference, in Proceedings of Machine Learning Research 242:197-208. Available from https://proceedings.mlr.press/v242/latafat24a.html.