Lazifying Conditional Gradient Algorithms

Gábor Braun, Sebastian Pokutta, Daniel Zink
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:566-575, 2017.

Abstract

Conditional gradient algorithms (also often called Frank-Wolfe algorithms) are popular because they require only a linear optimization oracle, and they have recently gained significant traction in online learning. While simple in principle, in many cases the actual implementation of the linear optimization oracle is costly. We show a general method to lazify various conditional gradient algorithms, which in actual computations leads to several orders of magnitude of speedup in wall-clock time. This is achieved by using a faster separation oracle instead of a linear optimization oracle, relying on only a few linear optimization oracle calls.
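The lazification idea described in the abstract — check a cache of previously returned vertices with a cheap separation-style test, and call the expensive linear optimization oracle only when no cached vertex yields enough progress — can be sketched as follows. This is an illustrative simplification, not the paper's exact algorithm: the function names, the halving schedule for the progress estimate `phi`, the threshold constant `K`, and the probability-simplex example below are assumptions made for the sketch.

```python
import numpy as np

def lazy_frank_wolfe(grad, lp_oracle, x0, phi0, iters=500, K=2.0):
    """Sketch of a lazified Frank-Wolfe loop.

    Instead of calling the (expensive) linear optimization oracle in
    every iteration, first scan a cache of previously returned vertices
    for one that already guarantees enough progress (a weak-separation
    test); fall back to the LP oracle only when the cache fails.
    """
    x = x0.copy()
    phi = phi0            # current estimate of achievable progress
    cache = [x0.copy()]   # vertices seen so far
    lp_calls = 0
    for t in range(iters):
        g = grad(x)
        # Weak separation via the cache: any vertex improving by >= phi/K?
        v = next((c for c in cache if g @ (x - c) >= phi / K), None)
        if v is None:
            v = lp_oracle(g)           # expensive call, only when needed
            lp_calls += 1
            cache.append(v)
            if g @ (x - v) < phi / K:  # even the LP optimum gives little
                phi /= 2.0             # progress: halve the estimate and
                continue               # keep x unchanged this round
        gamma = 2.0 / (t + 2.0)        # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * v
    return x, lp_calls
```

For example, minimizing f(x) = ½‖x − b‖² over the probability simplex (where the LP oracle simply returns the unit vector at the smallest gradient coordinate) converges to b while issuing far fewer LP-oracle calls than iterations, since most steps are served from the cache.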

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-braun17a,
  title     = {Lazifying Conditional Gradient Algorithms},
  author    = {G{\'a}bor Braun and Sebastian Pokutta and Daniel Zink},
  pages     = {566--575},
  year      = {2017},
  editor    = {Doina Precup and Yee Whye Teh},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  address   = {International Convention Centre, Sydney, Australia},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/braun17a/braun17a.pdf},
  url       = {http://proceedings.mlr.press/v70/braun17a.html},
  abstract  = {Conditional gradient algorithms (also often called Frank-Wolfe algorithms) are popular due to their simplicity of only requiring a linear optimization oracle and more recently they also gained significant traction for online learning. While simple in principle, in many cases the actual implementation of the linear optimization oracle is costly. We show a general method to lazify various conditional gradient algorithms, which in actual computations leads to several orders of magnitude of speedup in wall-clock time. This is achieved by using a faster separation oracle instead of a linear optimization oracle, relying only on few linear optimization oracle calls.}
}
Endnote
%0 Conference Paper
%T Lazifying Conditional Gradient Algorithms
%A Gábor Braun
%A Sebastian Pokutta
%A Daniel Zink
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-braun17a
%I PMLR
%J Proceedings of Machine Learning Research
%P 566--575
%U http://proceedings.mlr.press
%V 70
%W PMLR
%X Conditional gradient algorithms (also often called Frank-Wolfe algorithms) are popular due to their simplicity of only requiring a linear optimization oracle and more recently they also gained significant traction for online learning. While simple in principle, in many cases the actual implementation of the linear optimization oracle is costly. We show a general method to lazify various conditional gradient algorithms, which in actual computations leads to several orders of magnitude of speedup in wall-clock time. This is achieved by using a faster separation oracle instead of a linear optimization oracle, relying only on few linear optimization oracle calls.
APA
Braun, G., Pokutta, S. & Zink, D. (2017). Lazifying Conditional Gradient Algorithms. Proceedings of the 34th International Conference on Machine Learning, in PMLR 70:566-575.