Parameter-free Locally Accelerated Conditional Gradients

Alejandro Carderera, Jelena Diakonikolas, Cheuk Yin Lin, Sebastian Pokutta
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:1283-1293, 2021.

Abstract

Projection-free conditional gradient (CG) methods are the algorithms of choice for constrained optimization setups in which projections are often computationally prohibitive but linear optimization over the constraint set remains computationally feasible. Unlike in projection-based methods, globally accelerated convergence rates are in general unattainable for CG. However, a very recent work on Locally accelerated CG (LaCG) has demonstrated that local acceleration for CG is possible for many settings of interest. The main downside of LaCG is that it requires knowledge of the smoothness and strong convexity parameters of the objective function. We remove this limitation by introducing a novel, Parameter-Free Locally accelerated CG (PF-LaCG) algorithm, for which we provide rigorous convergence guarantees. Our theoretical results are complemented by numerical experiments, which demonstrate local acceleration and showcase the practical improvements of PF-LaCG over non-accelerated algorithms, both in terms of iteration count and wall-clock time.
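To make the setting concrete, here is a minimal sketch of the classical (non-accelerated) conditional gradient / Frank-Wolfe template that PF-LaCG builds on; this is not the paper's algorithm, and the constraint set (probability simplex), the quadratic objective, and all names (lmo_simplex, conditional_gradient) are illustrative assumptions.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin_{v in simplex} <grad, v> is the vertex e_i with i = argmin_i grad_i."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def conditional_gradient(grad_f, x0, lmo, num_iters=200):
    """Vanilla CG/Frank-Wolfe with the standard step size gamma_k = 2/(k+2).
    Each iterate is a convex combination of the previous iterate and an LMO
    vertex, so feasibility is maintained without any projection."""
    x = x0.copy()
    for k in range(num_iters):
        v = lmo(grad_f(x))           # one linear optimization per iteration
        gamma = 2.0 / (k + 2)
        x = x + gamma * (v - x)      # convex step keeps x in the constraint set
    return x

# Example: minimize a strongly convex quadratic over the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
Q = A.T @ A + np.eye(10)             # positive definite -> strongly convex
b = rng.standard_normal(10)
grad_f = lambda x: Q @ x + b
x0 = np.ones(10) / 10                # start at the simplex barycenter
x_star = conditional_gradient(grad_f, x0, lmo_simplex)
```

Note that the feasible region is accessed only through the LMO, one linear minimization per iteration, which is exactly the regime the abstract describes: projections are prohibitive but linear optimization is cheap. The paper's contribution is to couple such a CG method with a locally accelerated scheme without requiring the smoothness and strong convexity parameters as inputs.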

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-carderera21a,
  title     = {Parameter-free Locally Accelerated Conditional Gradients},
  author    = {Carderera, Alejandro and Diakonikolas, Jelena and Lin, Cheuk Yin and Pokutta, Sebastian},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {1283--1293},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/carderera21a/carderera21a.pdf},
  url       = {https://proceedings.mlr.press/v139/carderera21a.html}
}
Endnote
%0 Conference Paper
%T Parameter-free Locally Accelerated Conditional Gradients
%A Alejandro Carderera
%A Jelena Diakonikolas
%A Cheuk Yin Lin
%A Sebastian Pokutta
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-carderera21a
%I PMLR
%P 1283--1293
%U https://proceedings.mlr.press/v139/carderera21a.html
%V 139
APA
Carderera, A., Diakonikolas, J., Lin, C. Y., & Pokutta, S. (2021). Parameter-free Locally Accelerated Conditional Gradients. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:1283-1293. Available from https://proceedings.mlr.press/v139/carderera21a.html.