Locally Accelerated Conditional Gradients

Jelena Diakonikolas, Alejandro Carderera, Sebastian Pokutta
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1737-1747, 2020.

Abstract

Conditional gradients constitute a class of projection-free first-order algorithms for smooth convex optimization. As such, they are frequently used in solving smooth convex optimization problems over polytopes, for which the computational cost of projections is prohibitive. However, they do not enjoy the optimal convergence rates achieved by projection-based accelerated methods; moreover, achieving such globally accelerated rates is information-theoretically impossible. To address this issue, we present Locally Accelerated Conditional Gradients, an algorithmic framework that couples accelerated steps with conditional gradient steps to achieve local acceleration on smooth strongly convex problems. Our approach does not require projections onto the feasible set, but only onto (typically low-dimensional) simplices, thus keeping the computational cost of projections at bay. Further, it achieves optimal accelerated local convergence. Our theoretical results are supported by numerical experiments, which demonstrate significant speedups over state-of-the-art methods in both per-iteration progress and wall-clock time.
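
The coupling idea described above can be illustrated with a small NumPy sketch. The code below is not the authors' LaCG algorithm: it is a minimal toy that runs a standard Frank-Wolfe (conditional gradient) sequence alongside a projected accelerated-gradient sequence over the probability simplex and keeps whichever iterate is ahead. The helper names (coupled_cg, lmo_simplex, project_simplex), the step-size and momentum schedules, and the choice of the probability simplex as the feasible polytope are all assumptions made for illustration; in LaCG the accelerated steps are confined to the low-dimensional simplices spanned by the active vertices rather than relying on projections onto the full feasible set.

import numpy as np

def lmo_simplex(grad):
    # Linear minimization oracle over the probability simplex: the minimizer of
    # <grad, v> over the simplex is the vertex with the smallest gradient entry.
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def project_simplex(y):
    # Euclidean projection onto the probability simplex (sort-based method).
    u = np.sort(y)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, y.size + 1) > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(y - theta, 0.0)

def coupled_cg(f, grad_f, x0, L, num_iters=500):
    # Illustrative coupling (NOT LaCG): run a plain Frank-Wolfe sequence and a
    # projected accelerated-gradient sequence side by side and return the best
    # iterate seen.
    x_cg = x0.copy()      # conditional-gradient (Frank-Wolfe) iterate
    x_acc = x0.copy()     # accelerated iterate
    x_prev = x0.copy()
    best = x0.copy()
    for t in range(num_iters):
        # Frank-Wolfe step: call the LMO and move toward the returned vertex.
        v = lmo_simplex(grad_f(x_cg))
        gamma = 2.0 / (t + 2.0)
        x_cg = (1.0 - gamma) * x_cg + gamma * v

        # Nesterov-style step: extrapolate, take a gradient step, project back.
        y = x_acc + (t / (t + 3.0)) * (x_acc - x_prev)
        x_prev = x_acc
        x_acc = project_simplex(y - grad_f(y) / L)

        # Keep whichever of the two sequences is currently ahead.
        for cand in (x_cg, x_acc):
            if f(cand) < f(best):
                best = cand.copy()
    return best

# Usage: minimize a strongly convex quadratic over the probability simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
Q = A.T @ A + np.eye(50)                     # positive-definite Hessian
b = rng.standard_normal(50)
f = lambda x: 0.5 * x @ Q @ x + b @ x
grad_f = lambda x: Q @ x + b
L = np.linalg.eigvalsh(Q).max()              # smoothness constant
x_best = coupled_cg(f, grad_f, np.full(50, 1.0 / 50), L)
print(f"objective value: {f(x_best):.6f}")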

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-diakonikolas20a,
  title     = {Locally Accelerated Conditional Gradients},
  author    = {Diakonikolas, Jelena and Carderera, Alejandro and Pokutta, Sebastian},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {1737--1747},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/diakonikolas20a/diakonikolas20a.pdf},
  url       = {https://proceedings.mlr.press/v108/diakonikolas20a.html},
  abstract  = {Conditional gradients constitute a class of projection-free first-order algorithms for smooth convex optimization. As such, they are frequently used in solving smooth convex optimization problems over polytopes, for which the computational cost of projections is prohibitive. However, they do not enjoy the optimal convergence rates achieved by projection-based accelerated methods; moreover, achieving such globally-accelerated rates is information-theoretically impossible. To address this issue, we present Locally Accelerated Conditional Gradients – an algorithmic framework that couples accelerated steps with conditional gradient steps to achieve \emph{local} acceleration on smooth strongly convex problems. Our approach does not require projections onto the feasible set, but only on (typically low-dimensional) simplices, thus keeping the computational cost of projections at bay. Further, it achieves optimal accelerated local convergence. Our theoretical results are supported by numerical experiments, which demonstrate significant speedups over state of the art methods in both per-iteration progress and wall-clock time.}
}
Endnote
%0 Conference Paper
%T Locally Accelerated Conditional Gradients
%A Jelena Diakonikolas
%A Alejandro Carderera
%A Sebastian Pokutta
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-diakonikolas20a
%I PMLR
%P 1737--1747
%U https://proceedings.mlr.press/v108/diakonikolas20a.html
%V 108
%X Conditional gradients constitute a class of projection-free first-order algorithms for smooth convex optimization. As such, they are frequently used in solving smooth convex optimization problems over polytopes, for which the computational cost of projections is prohibitive. However, they do not enjoy the optimal convergence rates achieved by projection-based accelerated methods; moreover, achieving such globally-accelerated rates is information-theoretically impossible. To address this issue, we present Locally Accelerated Conditional Gradients – an algorithmic framework that couples accelerated steps with conditional gradient steps to achieve \emph{local} acceleration on smooth strongly convex problems. Our approach does not require projections onto the feasible set, but only on (typically low-dimensional) simplices, thus keeping the computational cost of projections at bay. Further, it achieves optimal accelerated local convergence. Our theoretical results are supported by numerical experiments, which demonstrate significant speedups over state of the art methods in both per-iteration progress and wall-clock time.
APA
Diakonikolas, J., Carderera, A. & Pokutta, S. (2020). Locally Accelerated Conditional Gradients. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:1737-1747. Available from https://proceedings.mlr.press/v108/diakonikolas20a.html.