Linearly Convergent Frank-Wolfe with Backtracking Line-Search

Fabian Pedregosa, Geoffrey Negiar, Armin Askari, Martin Jaggi
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1-10, 2020.

Abstract

Structured constraints in Machine Learning have recently brought the Frank-Wolfe (FW) family of algorithms back in the spotlight. While the classical FW algorithm has poor local convergence properties, the Away-steps and Pairwise FW variants have emerged as improved variants with faster convergence. However, these improved variants suffer from two practical limitations: they require at each iteration to solve a 1-dimensional minimization problem to set the step-size and also require the Frank-Wolfe linear subproblems to be solved exactly. In this paper we propose variants of Away-steps and Pairwise FW that lift both restrictions simultaneously. The proposed methods set the step-size based on a sufficient decrease condition, and do not require prior knowledge of the objective. Furthermore, they inherit all the favorable convergence properties of the exact line-search version, including linear convergence for strongly convex functions over polytopes. Benchmarks on different machine learning problems illustrate large performance gains of the proposed variants.
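The key idea described above, setting the Frank-Wolfe step-size from a sufficient decrease condition on a quadratic model with an adaptively estimated curvature constant, rather than an exact 1-dimensional line-search, can be illustrated with a minimal sketch. This is not the paper's exact pseudocode (which covers the Away-steps and Pairwise variants and inexact linear subproblems); it applies the same backtracking idea to classical FW over the probability simplex, and all names (`f`, `grad`, `lmo`, `backtracking_fw`) are our own.

```python
def f(x, b):
    """Smooth convex objective: 0.5 * ||x - b||^2."""
    return 0.5 * sum((xi - bi) ** 2 for xi, bi in zip(x, b))

def grad(x, b):
    return [xi - bi for xi, bi in zip(x, b)]

def lmo(g):
    """Linear minimization oracle over the simplex: vertex e_i minimizing <g, s>."""
    i = min(range(len(g)), key=lambda j: g[j])
    return [1.0 if j == i else 0.0 for j in range(len(g))]

def backtracking_fw(b, x, L=1e-3, max_iter=100, tau=2.0):
    """Frank-Wolfe with a backtracking step-size; L is a running curvature estimate."""
    for _ in range(max_iter):
        g = grad(x, b)
        s = lmo(g)
        d = [si - xi for si, xi in zip(s, x)]
        gap = -sum(gi * di for gi, di in zip(g, d))  # FW gap, always >= 0
        if gap < 1e-12:
            break
        d_sq = sum(di * di for di in d)
        while True:
            # Candidate step minimizing the quadratic model with estimate L,
            # clipped to the feasible range [0, 1].
            gamma = min(gap / (L * d_sq), 1.0)
            x_new = [xi + gamma * di for xi, di in zip(x, d)]
            # Sufficient decrease: accept if f drops at least as much as the model predicts.
            if f(x_new, b) <= f(x, b) - gamma * gap + 0.5 * L * gamma ** 2 * d_sq:
                break
            L *= tau  # estimate too small: increase and retry
        x = x_new
    return x
```

Note that no prior knowledge of the objective's Lipschitz constant is needed: the estimate `L` starts small and is only grown when the sufficient decrease test fails, which is the property the abstract highlights.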

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-pedregosa20a,
  title     = {Linearly Convergent Frank-Wolfe with Backtracking Line-Search},
  author    = {Pedregosa, Fabian and Negiar, Geoffrey and Askari, Armin and Jaggi, Martin},
  pages     = {1--10},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  address   = {Online},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/pedregosa20a/pedregosa20a.pdf},
  url       = {http://proceedings.mlr.press/v108/pedregosa20a.html},
  abstract  = {Structured constraints in Machine Learning have recently brought the Frank-Wolfe (FW) family of algorithms back in the spotlight. While the classical FW algorithm has poor local convergence properties, the Away-steps and Pairwise FW variants have emerged as improved variants with faster convergence. However, these improved variants suffer from two practical limitations: they require at each iteration to solve a 1-dimensional minimization problem to set the step-size and also require the Frank-Wolfe linear subproblems to be solved exactly. In this paper we propose variants of Away-steps and Pairwise FW that lift both restrictions simultaneously. The proposed methods set the step-size based on a sufficient decrease condition, and do not require prior knowledge of the objective. Furthermore, they inherit all the favorable convergence properties of the exact line-search version, including linear convergence for strongly convex functions over polytopes. Benchmarks on different machine learning problems illustrate large performance gains of the proposed variants.}
}
Endnote
%0 Conference Paper
%T Linearly Convergent Frank-Wolfe with Backtracking Line-Search
%A Fabian Pedregosa
%A Geoffrey Negiar
%A Armin Askari
%A Martin Jaggi
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-pedregosa20a
%I PMLR
%J Proceedings of Machine Learning Research
%P 1--10
%U http://proceedings.mlr.press
%V 108
%W PMLR
%X Structured constraints in Machine Learning have recently brought the Frank-Wolfe (FW) family of algorithms back in the spotlight. While the classical FW algorithm has poor local convergence properties, the Away-steps and Pairwise FW variants have emerged as improved variants with faster convergence. However, these improved variants suffer from two practical limitations: they require at each iteration to solve a 1-dimensional minimization problem to set the step-size and also require the Frank-Wolfe linear subproblems to be solved exactly. In this paper we propose variants of Away-steps and Pairwise FW that lift both restrictions simultaneously. The proposed methods set the step-size based on a sufficient decrease condition, and do not require prior knowledge of the objective. Furthermore, they inherit all the favorable convergence properties of the exact line-search version, including linear convergence for strongly convex functions over polytopes. Benchmarks on different machine learning problems illustrate large performance gains of the proposed variants.
APA
Pedregosa, F., Negiar, G., Askari, A. & Jaggi, M. (2020). Linearly Convergent Frank-Wolfe with Backtracking Line-Search. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in PMLR 108:1-10.