Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator

Alp Yurtsever, Suvrit Sra, Volkan Cevher
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7282-7291, 2019.

Abstract

We propose a class of variance-reduced stochastic conditional gradient methods. By adapting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) to the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.
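The SPIDER estimator tracks the gradient recursively: each epoch opens with a full (or large-batch) gradient, and every subsequent step corrects that estimate by the mini-batch gradient difference between consecutive iterates; the FW step then calls a linear minimization oracle instead of a projection. Below is a minimal sketch of this scheme in Python. The least-squares objective, l1-ball constraint, epoch length, batch size, and the classical 2/(t+2) step size are illustrative assumptions for this sketch, not the schedules analyzed in the paper.

import numpy as np

def lmo_l1_ball(grad, radius):
    # Linear minimization oracle over the l1-ball:
    # argmin_{||s||_1 <= radius} <grad, s> is attained at a signed vertex.
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def spider_fw(A, b, radius, epochs=20, epoch_len=50, batch=8, seed=0):
    # SPIDER-FW sketch for f(x) = (1/n) ||Ax - b||^2 over the l1-ball.
    rng = np.random.default_rng(seed)
    n, d = A.shape

    def grad(x, idx):
        # Mini-batch gradient of the least-squares loss on rows idx.
        return 2.0 * A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)

    x = np.zeros(d)
    t = 0
    for _ in range(epochs):
        # Anchor step: refresh the estimator with a full gradient.
        v = grad(x, np.arange(n))
        for _ in range(epoch_len):
            gamma = 2.0 / (t + 2)  # classical FW step size (illustrative)
            x_next = x + gamma * (lmo_l1_ball(v, radius) - x)
            # Path-integrated correction on a fresh mini-batch:
            # v_{t+1} = v_t + grad_S(x_{t+1}) - grad_S(x_t).
            idx = rng.choice(n, size=batch, replace=False)
            v = v + grad(x_next, idx) - grad(x, idx)
            x = x_next
            t += 1
    return x

# Usage on synthetic data: recover a sparse vector inside the unit l1-ball.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
x_true = np.zeros(10); x_true[0] = 0.5
b = A @ x_true + 0.01 * rng.standard_normal(200)
x_hat = spider_fw(A, b, radius=1.0)
print("objective:", np.mean((A @ x_hat - b) ** 2))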

Cite this Paper

BibTeX
@InProceedings{pmlr-v97-yurtsever19b,
  title = {Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator},
  author = {Yurtsever, Alp and Sra, Suvrit and Cevher, Volkan},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages = {7282--7291},
  year = {2019},
  editor = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume = {97},
  series = {Proceedings of Machine Learning Research},
  month = {09--15 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v97/yurtsever19b/yurtsever19b.pdf},
  url = {https://proceedings.mlr.press/v97/yurtsever19b.html},
  abstract = {We propose a class of variance-reduced stochastic conditional gradient methods. By adapting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) to the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.}
}
Endnote
%0 Conference Paper
%T Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator
%A Alp Yurtsever
%A Suvrit Sra
%A Volkan Cevher
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-yurtsever19b
%I PMLR
%P 7282--7291
%U https://proceedings.mlr.press/v97/yurtsever19b.html
%V 97
%X We propose a class of variance-reduced stochastic conditional gradient methods. By adapting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) to the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS.
APA
Yurtsever, A., Sra, S. & Cevher, V. (2019). Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7282-7291. Available from https://proceedings.mlr.press/v97/yurtsever19b.html.
